US20100060606A1 - Hard tap - Google Patents
Hard tap
- Publication number
- US20100060606A1 (application US 12/619,373)
- Authority
- US
- United States
- Prior art keywords
- pressure
- tap
- function
- computer
- input
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Definitions
- digitizers are able to detect the amount of pressure exerted by the user when making contact with a stylus. This information may be passed to the computing device in the form of a magnitude, perhaps as an eight-bit number. Typically, however, most operating systems, applications, and other software ignore this information, primarily interpreting contact pressure as a single click, regardless of magnitude. Notable exceptions include Adobe's Photoshop® and similar graphics programs, which use pressure on a digitizer tablet to simulate the varying strokes of a virtual paint brush.
- a first embodiment of the invention provides a computer-implemented method for adjusting a displayed control.
- An input from a digitizer, pressure-sensitive mouse, etc. is received at a location corresponding to the displayed control.
- the amount of pressure applied by the user is determined, and the displayed control is adjusted depending on the amount of pressure applied.
- a third embodiment of the invention provides a computer-implemented method for performing a function responding to a tap.
- the tap is received and location, pressure, and length of time of the tap are analyzed to determine if the tap is a hard tap. If the pressure exceeds a certain threshold within a certain amount of time, the tap is found to be a hard tap, and a particular function is performed. Failing the test, a different function is performed.
- FIG. 1 illustrates an operating environment that may be used for one or more aspects of an illustrative embodiment of the invention.
- FIG. 2 is a plan view of a digitizer display of an illustrative computing device.
- FIG. 3 depicts a graph of pressure over time and visual feedback provided by an illustrative embodiment of the invention.
- FIG. 4 illustrates movement of a scrollbar provided by an illustrative embodiment of the invention.
- FIG. 5 illustrates movement of a scrollbar provided by an illustrative embodiment of the invention.
- FIG. 7 illustrates incrementing a spinner control in a manner provided by an illustrative embodiment of the invention.
- FIG. 8 illustrates resizing of an object in a manner provided by an illustrative embodiment of the invention.
- FIG. 9 illustrates resizing of an object in a manner provided by an illustrative embodiment of the invention.
- FIG. 10 is a flowchart for a method for adjusting a displayed control provided by an illustrative embodiment of the invention.
- FIG. 11 illustrates selecting text in a manner provided by an illustrative embodiment of the invention.
- FIG. 12 illustrates selecting drawing objects in a manner provided by an illustrative embodiment of the invention.
- FIG. 13 illustrates using encounter selection to select file and folder objects in a manner provided by an illustrative embodiment of the invention.
- FIG. 14 illustrates using an encounter selection to select file and folder objects in a manner provided by an illustrative embodiment of the invention.
- FIG. 17 illustrates movement of a scrollbar provided by an illustrative embodiment of the invention.
- FIG. 20 depicts a distance threshold for determining a type of tap provided by an illustrative embodiment of the invention.
- FIG. 21 depicts a graph of input pressure over time not resulting in a hard tap as provided by an illustrative embodiment of the invention.
- FIG. 22 depicts a graph of input pressure over time resulting in a hard tap as provided by an illustrative embodiment of the invention.
- FIG. 24 depicts various input pressure thresholds over time for determining a type of tap as provided by an illustrative embodiment of the invention.
- the invention is operational with numerous other general purpose or special purpose computing system environments or configurations.
- Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to, personal computers (PCs); server computers; hand-held and other portable devices such as personal digital assistants (PDAs), tablet PCs or laptop PCs; multiprocessor systems; microprocessor-based systems; set top boxes; programmable consumer electronics; network PCs; minicomputers; mainframe computers; distributed computing environments that include any of the above systems or devices; and the like.
- illustrative computing system environment 100 includes a general purpose computing device in the form of a computer 110 .
- Components of computer 110 may include, but are not limited to, a processing unit 120 , a system memory 130 , and a system bus 121 that couples various system components including system memory 130 to processing unit 120 .
- System bus 121 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
- such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, Advanced Graphics Port (AGP) bus, and Peripheral Component Interconnect (PCI) bus, also known as Mezzanine bus.
- Computer storage media includes, but is not limited to, random-access memory (RAM), read-only memory (ROM), electrically-erasable programmable ROM (EEPROM), flash memory or other memory technology, compact-disc ROM (CD-ROM), digital video disc (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 110.
- Communication media typically embodies computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
- modulated data signal means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
- communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF) such as BLUETOOTH standard wireless links, infrared and other wireless media.
- the system memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory such as ROM 131 and RAM 132 .
- a basic input/output system (BIOS) 133 containing the basic routines that help to transfer information between elements within computer 110 , such as during start-up, is typically stored in ROM 131 .
- RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120 .
- FIG. 1 illustrates software including operating system 134 , application programs 135 , other program modules 136 , and program data 137 .
- Computer 110 may also include other computer storage media.
- FIG. 1 illustrates a hard disk drive 141 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 151 that reads from or writes to a removable, nonvolatile magnetic disk 152 , and an optical disk drive 155 that reads from or writes to a removable, nonvolatile optical disk 156 such as a CD-ROM, DVD, or other optical media.
- Other computer storage media that can be used in the illustrative operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital video tape, solid state RAM, solid state ROM, and the like.
- the hard disk drive 141 is typically connected to the system bus 121 through a non-removable memory interface such as interface 140 , and magnetic disk drive 151 and optical disk drive 155 are typically connected to the system bus 121 by a removable memory interface, such as interface 150 .
- hard disk drive 141 is illustrated as storing an operating system 144 , application programs 145 , other program modules 146 , and program data 147 . Note that these components can either be the same as or different from operating system 134 , application programs 135 , other program modules 136 , and program data 137 , respectively.
- Operating system 144 , application programs 145 , other program modules 146 , and program data 147 are given different numbers in FIG. 1 to illustrate that, at a minimum, they are different copies.
- A user may enter commands and information into computer 110 through input devices such as a keyboard 162 and pointing device 161, commonly referred to as a mouse, trackball or touch pad. Such pointing devices may provide pressure information, reporting not only the location of an input but also the pressure exerted while clicking or touching the device.
- Other input devices may include a microphone, joystick, game pad, satellite dish, scanner, or the like.
- These and other input devices are often coupled to processing unit 120 through a user input interface 160 that is coupled to system bus 121 , but may be connected by other interface and bus structures, such as a parallel port, game port, universal serial bus (USB), or IEEE 1394 serial bus (FIREWIRE).
- a monitor 184 or other type of display device is also coupled to the system bus 121 via an interface, such as a video adapter 183 .
- Video adapter 183 may comprise advanced 2D or 3D graphics capabilities, in addition to its own specialized processor and memory.
- Computer 110 may also include a digitizer 185 to allow a user to provide input using a stylus 186 .
- Digitizer 185 may either be integrated into monitor 184 or another display device, or be part of a separate device, such as a digitizer pad.
- Computer 110 may also include other peripheral output devices such as speakers 189 and a printer 188 , which may be connected through an output peripheral interface 187 .
- Computer 110 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180 .
- Remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to computer 110 , although only a memory storage device 181 has been illustrated in FIG. 1 .
- the logical connections depicted in FIG. 1 include a local area network (LAN) 171 and a wide area network (WAN) 173 , but may also include other networks.
- Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
- When used in a LAN networking environment, computer 110 is coupled to the LAN 171 through a network interface or adapter 170.
- computer 110 may include a modem 172 or another device for establishing communications over WAN 173 , such as the Internet.
- Modem 172 which may be internal or external, may be connected to system bus 121 via user input interface 160 or another appropriate mechanism.
- program modules depicted relative to computer 110 may be stored remotely such as in remote storage device 181 .
- FIG. 1 illustrates remote application programs 182 as residing on memory device 181 . It will be appreciated that the network connections shown are illustrative and other means of establishing a communications link between the computers may be used.
- FIG. 2 illustrates a computing device that may be used with an embodiment of the invention.
- the computing device here is a tablet computer 201 , which minimally includes a computer display 202 with an integral digitizer 203 , and which may receive input via a user pressing a stylus 204 against the digitizer.
- Computer 110 may be embodied as tablet computer 201 .
- Although tablet computers are used throughout this document as illustrative computing devices, they are only one among many possible computing devices that may be used to implement the invention.
- Alternative embodiments may comprise, by way of example, personal computers (PCs), laptop computers, handheld computers such as personal digital assistants (PDAs), cell phones, home electronic equipment, or any other computing device having or being coupled to an input device that detects input pressure, such as a digitizer or a pressure-sensitive pointing device such as a pressure-sensitive mouse, pressure-sensitive trackball, or pressure-sensitive joystick.
- The term "pressure-sensitive" is intended to refer to a pressure-sensitive input device that is capable of detecting (either directly or indirectly) and distinguishing between different amounts of applied input pressure, as opposed to merely being able to distinguish between input and non-input.
- the digitizer 203 relays to computer 201 data representing both a two-dimensional location of the contact, as well as an amount of the pressure applied.
- the amount of pressure may be represented as a magnitude (e.g., a numeric value along a range of numeric values), a pressure category (e.g., light, medium, heavy, etc.), or in any other manner.
- Digitizer 203 may continuously update this information over time as stylus 204 moves around the display surface and as the contact pressure increases or decreases.
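- As a rough illustration of how such pressure reports might be handled in software, the sketch below assumes an eight-bit raw magnitude (as mentioned earlier) and maps it to a normalized value and a coarse category; the function names and category boundaries are assumptions of this sketch, not part of the patent.

```python
# Illustrative handling of raw digitizer pressure reports. An 8-bit raw
# magnitude (0-255) is assumed, as suggested above; the function names and
# category boundaries are assumptions of this sketch.
def normalize_pressure(raw: int, max_raw: int = 255) -> float:
    """Map a raw pressure magnitude to the range 0.0-1.0."""
    return max(0, min(raw, max_raw)) / max_raw

def pressure_category(pressure: float) -> str:
    """Bucket a normalized pressure into a coarse light/medium/heavy category."""
    if pressure < 0.33:
        return "light"
    if pressure < 0.66:
        return "medium"
    return "heavy"

for raw_sample in (40, 128, 230):          # simulated raw samples over time
    p = normalize_pressure(raw_sample)
    print(f"raw={raw_sample:3d} -> {p:.2f} ({pressure_category(p)})")
```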
- Stylus 204 may be any type of stylus, such as a man-made object or a human body part, such as a fingertip.
- A man-made stylus may include, but is not limited to, a passive- or active-type pen-like stylus such as is conventionally provided with many PDAs and tablet computers.
- An illustrative graph 301 of contact pressure over time is shown in FIG. 3, which also depicts two forms of visual feedback provided on the display 202 by illustrative embodiments of the invention.
- stylus 204 is pressed against the surface of the display 202 , and thus against the integrated digitizer 203 as well.
- the contact pressure from the stylus 204 is gradually increased and then decreased over time, as depicted in graph 301 , which is typical of a tap of the stylus 204 against the digitizer 203 .
- the timeframe spanned by graph 301 may be just a fraction of a second, although the timeframe may be shorter or longer.
- the magnitude or other representation of the applied pressure may be sampled over time at various sample moments.
- stylus 204 has just started pressing against digitizer 203 at point 311 .
- the slight depression in the surface of digitizer 203 is detected as representative of the applied pressure, and the amount of pressure is shown as pressure magnitude 312 along graph 301 .
- This value may be passed to the computer (not shown), which may modify its behavior depending on the pressure, and also may provide feedback to a user on the display.
- Pressure may be detected in any of a number of ways, such as by directly measuring the pressure or by estimating the pressure in accordance with other variables, such as the amount of surface area physically affected by the pressing of the tip of the stylus 204 against the digitizer 203 .
- Computer 110 may provide visual, tactile, and/or audible feedback to the user in accordance with the amount of pressure applied by the stylus 204 .
- the pressure feedback provided by computer 110 may thus take many forms, any of which may alert the user to the level of pressure currently being exerted.
- visual forms of feedback may involve modifying the shape, color, size, or transparency of a cursor or an underlying control presented on display 202 .
- visual feedback may take the form of a pressure gauge, which may be depicted at a fixed location on display 202 .
- Audible forms of feedback may involve producing a series of clicks or varying the pitch of a sound in conjunction with the changing pressure.
- Tactile forms of feedback may include one or more intensities or frequencies of vibration of stylus 204 or a housing of computer 110 .
- FIG. 3 depicts two illustrative visual embodiments for providing pressure feedback to a user, but there are many other ways to provide this information.
- the value of pressure magnitude 312 is relayed to computer 110 , which may display a cursor 313 at a location corresponding to the point of contact of stylus 204 on the display.
- The point of arrow 317 is surrounded by cursor halo 314, the shape and/or size (e.g., diameter) of which is dependent upon the current amount of pressure being applied.
- computer 110 may retain the appearance of a cursor 315 , and instead modify the appearance of an underlying displayed control 316 , which is exemplified here as a button.
- cursors may vary, and can include hourglasses, pinwheels, insertion points, and so forth.
- a host of underlying displayed elements can be placed under the cursor and be similarly modified.
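- A minimal sketch of how halo size could track pressure, assuming a normalized 0.0-1.0 pressure value and arbitrary pixel bounds; the linear mapping and constants below are assumptions of this sketch.

```python
# Illustrative mapping from normalized pressure to a cursor-halo diameter.
# MIN_DIAMETER and MAX_DIAMETER are hypothetical pixel values for this sketch.
MIN_DIAMETER = 8    # halo size at the lightest detectable touch, in pixels
MAX_DIAMETER = 48   # halo size at full pressure, in pixels

def halo_diameter(pressure: float) -> int:
    """Linearly interpolate the halo diameter from a 0.0-1.0 pressure value."""
    pressure = max(0.0, min(pressure, 1.0))
    return round(MIN_DIAMETER + pressure * (MAX_DIAMETER - MIN_DIAMETER))

# Light, medium, and heavy presses produce progressively larger halos.
print([halo_diameter(p) for p in (0.1, 0.5, 0.9)])  # [12, 28, 44]
```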
- stylus 204 further depresses digitizer 203 at point 321 , which is recorded as pressure magnitude 322 .
- Point 321 may be in the same location on digitizer 203 as point 311 or a different location.
- cursor halo 324 expands or otherwise changes size and/or shape to indicate the higher pressure to the user.
- the color of the button 326 darkens (or otherwise changes its appearance) noticeably in response to varying levels of applied pressure.
- digitizer 203 at point 331 is further depressed, registering as pressure magnitude 332 .
- point 331 may be the same location as points 311 and/or 321 , or may be a different location, as will be discussed further.
- cursor halo 334 expands further to reflect the increase in pressure, or, in the second example, feedback button 336 darkens considerably.
- stylus 204 is beginning to lift away from digitizer 203 at point 341 , which registers as pressure magnitude 342 .
- cursor halo 344 begins to deflate in size, or, in the second example, button 346 begins to lighten in color or otherwise alter its appearance in a manner different from when the pressure was increasing.
- point 341 may be the same location as points 311 , 321 , and/or 331 , or may be in a different location.
- pressure feedback may not be desirable if and when computer 110 is not presently using the pressure information.
- For example, a first application may utilize pressure information while a second application may not. When the first application is being executed and/or is in focus, pressure feedback may be provided; when the second application is being executed and/or is in focus, pressure feedback may not be provided.
- conventional cursors and controls may be used both when pressure is and is not relevant. Feedback, such as the inclusion of a halo around a cursor, may therefore not only provide information on the current level of pressure being applied, but may also relay the fact that pressure is presently relevant. By viewing a cursor halo, or a color varying control, the user may be informed that additional input can be provided through the use of pressure. This selective providing of pressure feedback may allow controllable pressure input to be an easily-discoverable feature even for an amateur user, through simple trial and error.
- Calibration may be an integral part of pressure sensitive input. Differing digitizers may translate force differently. As such, an operating system or application which allows for the use of pressure input may provide a calibration routine or pressure settings dialog. This may assist in standardizing input from various types of hardware. Additionally, it may allow weak or disabled individuals to vary the sensitivity of pressure measurements to accommodate their strength.
- pressure information may be used to distinguish between which window receives an input.
- a softer contact may indicate a normal input to the current window, whereas a harder contact may be directed to another application, such as a media player playing in the background.
- Enabling such uses of pressure enables a power user to work more efficiently by providing a larger input “vocabulary” without needlessly complicating the interface.
- average users who opt not to use pressure or users of devices without pressure sensitivity will not see their experience degrade.
- an input device may detect the surface area of contact, the temperature at the point of contact, or dampness at the point of contact. Each may be used to supplement an input and control device behavior.
- a device receiving pressure information uses that information to augment the input it receives.
- This information can be used in a wide variety of ways, including pressure-sensitive controls, pressure-sensitive selection, and through the use of a variable-pressure tap, as will be discussed in detail.
- Controls in a graphical user interface present intuitive input metaphors that allow users to manipulate and maneuver through data. Such controls are common elements found in graphical computer operating systems, individual applications, and other forms of software. Examples of known controls include, but are not limited to, scrollbars, spinner controls, resize handles, checkboxes, pull down menus, and buttons.
- controls may be augmented through the use of pressure data, providing users with an additional way to interact with displayed controls. For example, by making controls pressure-sensitive, users may be presented with a more manipulable user interface that responds faster and more accurately to their commands and that may more closely operate in accordance with the user's intent.
- Value-based controls (i.e., controls that manipulate an underlying numeric, alphabetic, or alphanumeric value) and repeating controls (i.e., controls that repeat the same action when continuously selected) may be better served by the addition of pressure sensitivity.
- Illustrative examples of pressure-sensitive controls are set forth below.
- FIG. 4 illustrates movement of a vertical scrollbar 403 provided by an illustrative embodiment of the invention.
- Scrollbar 403 graphically displays and updates an underlying index value indicating a current position within a corresponding document 404 .
- FIG. 4 is divided into two arbitrary “frames” 410 , 411 for explanatory purposes only.
- the frames 410 , 411 show the same displayed graphical user interface as it changes over time, starting with portion 410 and ending with portion 411 .
- a cursor 401 indicates that the user is presently activating the down arrow button of vertical scrollbar 403 .
- Scrollbar 403 indicates the vertical placement of document 404 according to the vertical location of a thumb control 405 .
- the size of cursor halo 402 signifies a relatively small amount of pressure being applied.
- The user has continued to activate the down arrow button, and the displayed portion of document 404 has scrolled down (which is actually implemented by moving document 404 up), as indicated by the new location of thumb control 405. Because of the small amount of pressure applied by the user, the computer scrolls scrollbar 403 relatively slowly.
- FIG. 5 illustrates the same embodiment depicted in FIG. 4 , and like FIG. 4 is also divided into two time-lapsed frames 510 , 511 , where the amount of time that passes between frames 510 and 511 is the same as the amount of time that passes between frames 410 and 411 .
- scrollbar 403 is at the same starting point as in frame 410 , but this time, the user is pressing harder with stylus 204 , as indicated by a larger cursor halo 502 .
- thumb control 405 indicates that the displayed portion of document 404 has scrolled down further than it did in FIG. 4 .
- scrolling speed of scrollbar 403 is dependent upon the amount of pressure applied to the down arrow button of scrollbar 403 .
- the same may apply to the up arrow button of scrollbar 403 , as well as the left and right arrow buttons of a horizontal scrollbar.
- the scrolling speed of scrollbar 403 may have any relationship to the applied pressure, such as a linear relationship or a nonlinear relationship.
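- The sketch below illustrates both a linear and a nonlinear pressure-to-scroll-speed mapping under the assumption of a normalized pressure value; the maximum-speed constant and the squared curve are illustrative choices, not values from the patent.

```python
# Illustrative pressure-to-scroll-speed mappings (in lines per second).
# MAX_LINES_PER_SECOND and the squared curve are assumptions of this sketch.
MAX_LINES_PER_SECOND = 40

def scroll_speed_linear(pressure: float) -> float:
    """Scroll speed grows in direct proportion to normalized pressure."""
    return MAX_LINES_PER_SECOND * max(0.0, min(pressure, 1.0))

def scroll_speed_nonlinear(pressure: float) -> float:
    """Squaring the pressure keeps light presses slow while rewarding hard ones."""
    p = max(0.0, min(pressure, 1.0))
    return MAX_LINES_PER_SECOND * p * p

print(round(scroll_speed_linear(0.3), 1), round(scroll_speed_nonlinear(0.3), 1))  # 12.0 3.6
print(round(scroll_speed_linear(0.9), 1), round(scroll_speed_nonlinear(0.9), 1))  # 36.0 32.4
```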
- FIG. 6 illustrates incrementing a spinner control 601 provided by an illustrative embodiment of the invention.
- Spinner control 601 has an entry field 602 , where a user can view or change a numeric value, and up arrow button 603 and down arrow button 604 , the selection of which respectively increments or decrements the value in the entry field.
- FIG. 6 is also divided into frames 610 , 611 , 612 , which show the same graphical user interface changing over time from left to right in that figure.
- entry field 602 contains an initial value 0. Once a user activates up arrow button 603 , the value in entry field 602 begins to increment.
- this activation is indicated by arrow cursor 605 with cursor halo 606 .
- the value in entry field 602 has increased at a rate and/or increment (e.g., incrementing by 1) that depends upon the applied pressure.
- the increase continues at a rate and/or increment that depend upon the applied pressure.
- In FIG. 6 it is assumed that the same amount of time has elapsed between frames 610 and 611 as between frames 611 and 612, and for simplicity that the applied pressure has remained constant throughout frames 610-612.
- FIG. 7 illustrates the same embodiment depicted in FIG. 6 , and like FIG. 6 is also divided into three time-lapsed frames 710 , 711 , 712 , where the amount of time that passes between frames 710 and 711 is the same as the amount of time that passes between frames 610 and 611 , and the amount of time that passes between frames 711 and 712 is the same as the amount of time that passes between frames 611 and 612 .
- spinner control 601 has been reset to the same initial value as in frame 610 .
- In second frame 711, when greater pressure is applied to activate up arrow button 603, the value in entry field 602 is updated using a larger increment and/or at a greater rate than in FIG. 6.
- the higher pressure is indicated by the larger cursor halo 706 .
- the higher pressure continues updating the value using the larger increment and/or at a greater rate.
- the value of a spinner control may increase or decrease by an increment and/or at a rate that depends linearly or nonlinearly upon the applied pressure.
- pressure sensitivity may be used to control the rate of adjustment of a value or index.
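- One hypothetical way to realize such a pressure-dependent rate of adjustment is to bucket the applied pressure into bands, as sketched below; the band boundaries and increment sizes are assumptions for illustration.

```python
# Illustrative choice of spinner increment from applied pressure. The pressure
# bands and increment sizes are hypothetical, not values from the patent.
def spinner_increment(pressure: float) -> int:
    """Return how much to add to the spinner value per repeat interval."""
    if pressure < 0.33:
        return 1    # light press: fine-grained adjustment
    if pressure < 0.66:
        return 5    # medium press: faster adjustment
    return 25       # heavy press: coarse, rapid adjustment

value = 0
for sampled_pressure in (0.2, 0.2, 0.5, 0.8, 0.8):  # simulated samples over time
    value += spinner_increment(sampled_pressure)
print(value)  # 57
```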
- Although the pressure applied in the examples above remained constant, that does not need to be the case. Indeed, a user may continuously and dynamically increase and decrease the pressure applied when providing an input in order to affect the operation of the control. As the pressure changes, so does the effect on the control, such as the rate of change.
- the value need not be numeric, and could be chosen from any defined set, such as a set of characters, words, colors, patterns, and so forth, depending on the particular need.
- a color selection can be made by using pressure to speed and slow the rate of change among a selection of colors and/or to control the intensity or other property of a color.
- Another embodiment may involve selecting a state in a pull down menu and using pressure to speed and slow the movement of the list.
- FIG. 8 illustrates resizing a drawing object 801 using an illustrative embodiment of the invention.
- the context here may be a drawing application or other program that allows users to resize objects.
- drawing object 801 is being resized by a user.
- Arrow cursor 803 having cursor halo 804 is positioned over resize handle 802 at one corner of object 801 .
- Light pressure is applied by the user, as indicated by the small cursor halo 804 .
- the application allows the user to resize following a smooth, freeform path.
- resize handle 802 has been repositioned at location 805 .
- FIG. 9 illustrates the same embodiment of the invention depicted in FIG. 8 .
- the user is applying more pressure, as indicated by larger cursor halo 904 .
- the application may cause the movement of resize handle 802 to be non-smooth, such as by constraining it to fixed positions on a predetermined grid, for example.
- resize handle 802 is relocated at location 905 .
- resize handle 802 is limited to locations along a fixed grid, which may or may not be visible to the user.
- the amount of pressure applied may affect the grid increment by which object 801 can be resized.
- An operating system or application may also reverse the relationship, constraining resize movement to a grid when less pressure is applied and allowing freeform resizing only when more pressure is applied.
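- A brief sketch of the grid-snapping behavior of FIGS. 8 and 9 (freeform at light pressure, snapped at heavier pressure), assuming a normalized pressure value, a hypothetical grid spacing, and a hypothetical snap threshold.

```python
# Illustrative resize-handle placement for FIGS. 8 and 9: freeform at light
# pressure, snapped to a grid at heavier pressure. The grid spacing and the
# snap threshold are hypothetical values chosen for this sketch.
GRID = 20              # hypothetical grid spacing, in pixels
SNAP_THRESHOLD = 0.6   # hypothetical pressure above which snapping applies

def place_handle(x, y, pressure):
    """Return the handle position, constrained to the grid at high pressure."""
    if pressure >= SNAP_THRESHOLD:
        return (round(x / GRID) * GRID, round(y / GRID) * GRID)
    return (x, y)

print(place_handle(133.0, 87.0, 0.3))  # (133.0, 87.0) -- freeform placement
print(place_handle(133.0, 87.0, 0.8))  # (140, 80)     -- snapped to the grid
```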
- the pressure-sensitive control (i.e., resize handle 802 ) disclosed in FIGS. 8 and 9 is not necessarily limited to resizing objects. It may apply to other controls such as for moving objects, for example. It may also be used for resizing or repositioning windows in an operating system or application. Ultimately, the technique can be used in any instance where values defining an object's screen position, shape, size, and/or any other property are being modified. Furthermore, other forms of pressure-sensitive controls may be implemented. For example, a button or other control that performs an action when activated and that repeats the action when continuously activated may be pressure-sensitive.
- Such a repeating control may repeat the action faster when greater pressure is applied to the input, and likewise slow down the rate of repetition when less pressure is applied.
- An additional implementation involves the use of harder pressure to vary the behavior of a drag operation. For example, dragging a drawing object while pressing harder may cause the object to paste in repeated fashion, similar to a rubber stamp effect; the harder the drag, the more frequent the repeating stamps.
- FIG. 10 illustrates a method for adjusting a displayed control provided by an illustrative embodiment of the invention.
- computer 110 displays a pressure sensitive control on display 202 , for example a scrollbar, resize handle, button, spinner control, and so forth.
- In step 1002, a user directs input to the control, coupled with pressure applied via stylus 204, and computer 110 receives the input.
- computer 110 determines the amount of pressure applied by the user.
- computer 110 may, at this point, compare the pressure applied to an amount applied previously in order to determine its relative magnitude.
- computer 110 moves or modifies the respective control (including adjusting any underlying value) in accordance with the amount of pressure applied.
- a greater amount of pressure from the user may cause adjustments to the control carried out in a first manner, perhaps by speeding up or using larger increments, whereas less pressure may cause adjustments to occur in a second manner, such as slower adjustments, or even cause the control to behave as if not pressure-sensitive.
- computer 110 determines whether the user input continues, and if so, performs steps 1002 - 1005 again. Otherwise, the method terminates and/or awaits another user input.
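- A minimal sketch of the FIG. 10 loop, assuming a stream of (location, pressure) samples and a simple Control object with an adjust() method; these names and the adjustment formula are hypothetical stand-ins, not the patent's own implementation.

```python
# Minimal sketch of the FIG. 10 loop. The Control class, the sample format,
# and the adjustment formula are hypothetical stand-ins for illustration.
from dataclasses import dataclass

@dataclass
class Control:
    value: float = 0.0

    def adjust(self, pressure: float) -> None:
        # Step 1004: greater pressure adjusts the underlying value faster.
        self.value += 1.0 + 9.0 * max(0.0, min(pressure, 1.0))

def run_control_loop(control: Control, samples) -> None:
    """Steps 1002-1005: receive input, read its pressure, adjust, repeat."""
    for _location, pressure in samples:   # step 1002: input received at the control
        control.adjust(pressure)          # step 1003: pressure determined; step 1004: adjust
    # The loop ends when no further input arrives (decision 1005).

ctrl = Control()
run_control_loop(ctrl, [((10, 10), 0.2), ((10, 10), 0.7), ((10, 10), 0.9)])
print(round(ctrl.value, 1))  # 19.2
```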
- Item selection is a common activity for users of graphical computer operating systems, applications, and other software, and an activity that can benefit from pressure sensitivity. Selecting choices from a list, words in a document, files in a folder, etc. are tasks with which most users are familiar. Pressure sensitivity enhances the process of item selection, for example when the use of double or triple mouse clicks is needed to broaden a selection but no mouse is available, such as with a tablet computer. For example, by pressing harder, a user signals that he wants to select a larger number of items. A user need not attempt a notoriously difficult double or triple click with a stylus on a tablet computer.
- FIG. 11 illustrates selecting text in a word processing application in a manner provided by an illustrative embodiment of the invention.
- FIG. 11 is divided into three frames 1100 , 1110 , 1120 .
- Frames 1100 , 1110 , and 1120 illustrate how a graphical user interface may react to different applied pressures.
- a word 1102 in paragraph 1101 is being selected by cursor 1103 having cursor halo 1104 .
- A user applies a small amount of pressure to select word 1102.
- the user may simply place an insertion point in the middle of word 1102 .
- FIG. 12 illustrates selecting drawing objects in a drawing software program in a manner provided by an illustrative embodiment of the invention.
- FIG. 12 is divided into three frames 1200 , 1210 , 1220 that illustrate how a graphical user interface may react to different applied pressures.
- drawing object 1201 is selected by a user using a small amount of pressure, as indicated by cursor 1202 having a small cursor halo 1203 .
- Selected object 1201 may be surrounded by a selection tool embodied as a selection border 1204 .
- Each drawing object within the selection border 1204 is part of the selection.
- the size of the selection border 1204 depends upon the amount of applied pressure. As indicated by the size of cursor halo 1203 , the user is pressing lightly in order to select the object currently under cursor 1202 . As the user presses harder, in second frame 1210 , selection border 1204 grows in accordance with the higher applied pressure, and in this case grows sufficiently large so as to encompass more objects, including, for example, object 1215 . Cursor halo 1213 reflects the increasing pressure applied by the user. As the user presses harder still, in third and final frame 1220 , selection border 1204 grows larger still, encompassing more drawing objects in this case including, for example, object 1226 .
- Selection border 1204 may be constrained to grow to encompass only objects connected or adjacent to the originally-selected object, or may be configured to also encompass objects not connected or adjacent to the originally-selected object. As before, reducing the pressure applied may return the selection to a smaller number of objects.
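- The sketch below models the growing selection border as a radius that expands with pressure, selecting any object whose center falls inside it; the radius formula and constants are assumptions of this sketch, not taken from the patent.

```python
# Illustrative pressure-driven selection border: objects whose centers fall
# within a radius that grows with pressure are selected. The radius formula
# and constants are assumptions of this sketch.
import math

BASE_RADIUS = 10   # border radius at minimal pressure, in pixels
MAX_EXTRA = 160    # additional radius added at full pressure, in pixels

def selected_objects(anchor, objects, pressure):
    """Return the objects inside the pressure-dependent selection border."""
    radius = BASE_RADIUS + MAX_EXTRA * max(0.0, min(pressure, 1.0))
    ax, ay = anchor
    return [o for o in objects if math.hypot(o[0] - ax, o[1] - ay) <= radius]

shapes = [(0, 0), (40, 30), (120, 90)]           # object center points
print(selected_objects((0, 0), shapes, 0.1))     # light press: only the nearest object
print(selected_objects((0, 0), shapes, 0.9))     # hard press: border grows to cover all three
```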
- U.S. Patent Application Publication No. 20040021701 A1 entitled “Freeform Encounter Selection Tool,” which is hereby incorporated by reference as to its disclosure of an encounter selection tool, discloses a freeform encounter selection tool for a computer system with a graphical user interface that allows a user to draw a freeform selection path so as to select one or more items along or near the path. As the user drags a pointing device such as a stylus or a mouse, a freeform selection path is created so that the encounter selection tool selects graphical items that are encountered.
- FIG. 13 illustrates using an encounter selection tool to select file and folder objects by an illustrative embodiment of the invention.
- a subset of a collection 1301 of files and folders is selected by dragging cursor 1302 , having cursor halo 1303 , from start point 1304 to end point 1306 .
- Folders and files encountered along the path of the cursor, for example folder 1305, are selected and highlighted as shown.
- the user may then perform a common function among all of the files, such as throwing them away.
- As indicated by the cursor halo, the user is only pressing lightly when he uses the encounter selection tool. This leads to a relatively narrow selection path. In this embodiment, the lighter the pressure, the narrower the selection path, and thus in general the fewer the number of objects selected along the selection path.
- FIG. 14 illustrates the same collection 1301 of files and folders presented in FIG. 13 .
- the user presses harder while moving cursor 1302 from start point 1304 to end point 1306 , as reflected in larger cursor halo 1403 .
- the result of the increased pressure is that a wider selection path is created and a larger number of objects is selected, including, for example, document 1405 .
- Although this embodiment has been discussed with regard to a freeform selection path that follows movement of the cursor 1302, the selection path may take other forms, such as a linear path that extends between the start point 1304 and the end point 1306, regardless of the path the cursor 1302 takes between the start and end points 1304, 1306.
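- One way to sketch the pressure-widened selection path is to treat it as a straight segment from start point to end point and select items lying within half the path width of that segment; the width formula and the point-to-segment test below are illustrative assumptions.

```python
# Illustrative encounter selection: an item is selected when it lies within
# half of a pressure-dependent path width of the straight segment from start
# point to end point. The width formula and the linear-path simplification
# are assumptions of this sketch.
import math

def path_width(pressure: float) -> float:
    """Hypothetical mapping from normalized pressure to path width in pixels."""
    return 10 + 90 * max(0.0, min(pressure, 1.0))

def dist_to_segment(p, a, b) -> float:
    """Distance from point p to the segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def encounter_select(items, start, end, pressure):
    half_width = path_width(pressure) / 2
    return [i for i in items if dist_to_segment(i, start, end) <= half_width]

icons = [(50, 12), (80, 40), (120, 5)]                 # icon center points
print(encounter_select(icons, (0, 0), (160, 0), 0.1))  # narrow path: [(120, 5)]
print(encounter_select(icons, (0, 0), (160, 0), 0.9))  # wide path selects all three
```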
- FIG. 15 illustrates one method for selecting displayed objects provided by an illustrative embodiment of the invention.
- step 1501 a collection of selectable items such as characters, drawing objects, file icons, etc. is displayed.
- a user selects at least one of the items with stylus 204 , and the stylus input is received in step 1502 .
- step 1503 the amount of pressure applied by the user with the stylus 204 is determined so that, in step 1504 , the identities of which items are selected can be modified. For example, when a user presses harder during a selection operation, the selection path widens, and so more of the items may be selected. If there is more input from the user, at decision 1505 , the steps continue. Otherwise, the method terminates normally or awaits further user input.
- a lasso selection tool which is familiar from its use in graphics editing software, may be enhanced with pressure information.
- lasso selection allows a user to encircle in freeform fashion a graphic of interest and cut or copy it.
- A user, by pressing harder while encircling a selected graphic of interest, may control the amount of feathering used to soften the edges of the graphic when it is cut or copied, or whether to select objects only partially within the lasso (e.g., lower pressure does not select such objects while higher pressure does).
- pressure based selection may allow for a selection zoom. For example, while pressing harder on a pixel to select it, the screen may zoom in further allowing greater detail to be displayed. The user can then achieve more precision while selecting neighboring pixels.
- Tapping a digitizer is a common form of user interaction with a computing device, such as a tablet computer, which can be enhanced by exploiting pressure information available from the digitizer.
- the term tap includes the contact and removal of a stylus, such as a pen, finger, or any other pointing implement, with the surface of the digitizer.
- Conventionally, a tap is interpreted as equivalent to a mouse click, regardless of how much force was used to impact the digitizer.
- Pressure information can be used, however, to distinguish normal taps from hard taps, enabling new sources of user input. For example, a tap with an applied pressure within a given first pressure range may be considered a normal tap, whereas a tap with an applied pressure within a given higher second pressure range may be considered a hard tap.
- a normal tap may, for example, be interpreted as a simple click
- a hard tap may, for example, be used to trigger additional functionality.
- a hard tap may be interpreted as a double-tap (notoriously difficult on digitizer displays), or as a right click, or as a trigger for an on-screen keyboard for tapping out words and sentences, or as a request to launch an application, or as a middle click (on a three button mouse), or as a scroll wheel click, and so forth.
- FIGS. 16 and 17 illustrate one embodiment of hard taps.
- FIGS. 16 and 17 are each divided into two arbitrary frames 1610 , 1611 , 1710 , 1711 that show how a displayed graphical user interface is affected by different types of taps.
- FIG. 16 depicts movement of a scrollbar under conditions of a normal tap provided by an illustrative embodiment of the invention.
- first frame 1610 a user taps on scrollbar 1603 , the tap being indicated by the temporary placement of cursor 1601 and starburst halo 1602 .
- the starburst halo 1602 may signify to the user that a tap is being received as opposed to a press-and-hold.
- the small starburst halo 1602 indicates, in this case, that the tap was not very hard.
- the results of the tap are viewable.
- Document 1604 has scrolled down one page, and the thumb control 1605 has shifted down.
- First frame 1710 of FIG. 17 illustrates the same starting position of document 1604 and thumb control 1605 .
- a user taps the same location on scrollbar 1603 as before, but this time taps harder.
- the starburst halo 1702 that temporarily appears is larger, indicating that a harder tap was registered than in FIG. 16 .
- the hard tap in this case triggers a different function.
- thumb control 1605 jumps directly to the location of the harder tap. This can be useful for a user who wants to go directly to a portion of document 1604 without waiting for the scrollbar to page down.
- FIGS. 18 and 19 illustrate a second embodiment of hard and normal taps.
- The figures are each divided into two arbitrary frames 1810, 1811, 1910, 1911 that show the effect of different types of taps.
- first frame 1810 of FIG. 18 a file 1801 receives a single normal tap, as signified by cursor 1802 with starburst halo 1803 .
- second frame 1811 computer 110 interprets the normal tap as a left click and file 1804 is highlighted.
- the first frame 1910 of FIG. 19 differs from that of FIG. 18 in that file 1801 receives a hard tap, as signified by larger starburst halo 1903 .
- computer 110 performs a separate action in second frame 1911 , treating the hard tap as a right click, and displaying the context sensitive menu rather than selecting the file.
- variable-pressure tap embodiments described above are only a few of the uses of such an enhancement.
- the starburst halo described above is merely demonstrative.
- other forms of visual feedback include changing the color or transparency of the point of impact rather than changing the cursor.
- audio feedback may distinguish a hard tap from another tap or input.
- the existence of a particular sound, or the volume of a sound, or a particular pitch may provide the feedback needed by a user to distinguish tap types.
- the methods for evaluating a hard tap and distinguishing hard taps from other forms of input are set forth in some detail below.
- Distinguishing a hard tap from other user inputs on a digitizer may involve determining whether the tip of stylus 204 maintains a substantially constant position on the surface of digitizer 203 .
- a tap that moves across the surface of digitizer 203 is more likely an intended drag operation rather than a tap, and so a distance threshold may be used to ensure that the point of contact does not move too far.
- Such a threshold is depicted in FIG. 20 , which provides an illustrative layout for a distance threshold on an X-Y coordinate plane defining locations on the surface of digitizer 203 .
- The initial point of contact is represented by the black unit square in the middle of the figure. Each unit square (or other shaped area) may represent a pixel on the underlying display and/or a minimum resolvable area that may be sensed by digitizer 203, or any other unit of area, whether arbitrary or defined in accordance with properties of display 202 and/or digitizer 203. The immediately adjacent squares or pixels (cross-hatched in FIG. 20) form a distance threshold.
- For example, stylus 204 may initially impact the pressure-sensitive surface of digitizer 203 at the black square and may immediately thereafter slide just a little bit; as long as the contact stays within the cross-hatched threshold area, the input may still be treated as a tap.
- This distance threshold is configurable. If a user has a difficult time holding a pointing implement steady, she may be able to adjust the distance threshold, such as by increasing the distance threshold to include a larger range of acceptable contact points.
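- A small sketch of the distance-threshold check, assuming positions measured in the unit squares of FIG. 20 and a Chebyshev (square-shaped) neighborhood so that a threshold of one unit covers the eight adjacent squares; the metric choice is an assumption of this sketch.

```python
# Illustrative distance-threshold check for FIG. 20, with positions measured in
# unit squares. A threshold of 1 lets the contact wander into any of the eight
# immediately adjacent squares and still count as a tap; the Chebyshev metric
# used here is an assumption of this sketch.
def within_distance_threshold(start, current, threshold_units: int = 1) -> bool:
    dx = abs(current[0] - start[0])
    dy = abs(current[1] - start[1])
    return max(dx, dy) <= threshold_units

print(within_distance_threshold((0, 0), (1, 1)))                      # True: adjacent square
print(within_distance_threshold((0, 0), (3, 0)))                      # False: slid too far
print(within_distance_threshold((0, 0), (3, 0), threshold_units=4))   # True: user enlarged the threshold
```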
- FIG. 21 depicts a graph 2101 of input pressure over time not resulting in a hard tap as provided by an illustrative embodiment of the invention.
- The tap has stayed within the appropriate distance threshold, but by the time the contact passes time threshold 2102 (perhaps set at ¼ of a second or another amount of time) at point 2104, the magnitude of pressure has not passed pressure threshold 2103.
- computer 110 may interpret the input as a normal tap or some other type of input.
- FIG. 22 depicts a graph 2201 of input pressure over time resulting in a hard tap as provided by an illustrative embodiment of the invention.
- the contact being graphed has surpassed a pressure threshold 2203 at a point 2204 before reaching a time threshold 2202 .
- This may therefore be registered as a hard tap, and the operating system or application can process it as such. If the operating system or application does not take advantage of hard taps, then the tap can be interpreted as a normal tap.
- FIG. 23 depicts a graph 2301 of input pressure over time in which the tap input does not register as a hard tap as provided by an illustrative embodiment of the invention.
- the contact with the surface of digitizer 203 eventually exceeds a pressure threshold 2303 , but not within a time threshold 2302 .
- time threshold 2302 When the time of contact exceeds time threshold 2302 at a point 2304 , similar to the example in FIG. 21 , the input can be passed on as something other than a hard tap, such as a normal tap.
- time and pressure thresholds used to detect a hard tap may be user- or software-configurable. Individual users can perfect their successful use of the hard tap by adjusting the pressure threshold and time threshold. For example, a particular user may not be able to achieve the pressure magnitude needed, and can adjust the pressure and/or time thresholds as desired, such as to adjust the pressure threshold lower to allow for “softer” hard taps. Pressure and/or time thresholds may further be automatically adjusted through a calibration routine.
- Computer 110, via a user interface, may request that the user execute what the user considers to be a normal tap as well as a hard tap.
- Computer 110 may measure the pressure and time properties of the user's input and automatically determine appropriate time and/or pressure thresholds in accordance with those properties.
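- A hypothetical calibration routine in this spirit might place the pressure threshold between the user's measured normal-tap and hard-tap peaks, as sketched below; averaging the samples and splitting the difference is an assumption of this sketch, not a rule stated in the patent.

```python
# Hypothetical calibration sketch: derive a pressure threshold from the peak
# pressures measured while the user performs a few normal taps and a few hard
# taps. Splitting the difference between the two averages is an assumption of
# this sketch, not a rule stated in the patent.
def calibrate_pressure_threshold(normal_peaks, hard_peaks):
    avg_normal = sum(normal_peaks) / len(normal_peaks)
    avg_hard = sum(hard_peaks) / len(hard_peaks)
    return (avg_normal + avg_hard) / 2

normal_taps = [0.30, 0.35, 0.28]   # simulated peak pressures for normal taps
hard_taps = [0.75, 0.82, 0.70]     # simulated peak pressures for hard taps
print(round(calibrate_pressure_threshold(normal_taps, hard_taps), 2))  # 0.53
```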
- FIG. 24 depicts a set of pressure ranges over time as provided by an illustrative embodiment of the invention.
- taps not exceeding a pressure threshold 2401 within time threshold 2403 will be considered normal taps.
- Taps exceeding a pressure threshold 2401 , but not exceeding a higher pressure threshold 2402 within time threshold 2403 will be interpreted as medium taps.
- taps exceeding pressure threshold 2402 will be interpreted as hard taps.
- Medium taps may be useful in some interface environments. Individual applications may provide different thresholds depending on their need, overriding thresholds set by the operating system.
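- The sketch below classifies a tap as normal, medium, or hard from the peak pressure reached within the time threshold, mirroring the two pressure thresholds of FIG. 24; all numeric values are hypothetical defaults.

```python
# Illustrative three-way tap classification mirroring FIG. 24. All numeric
# values are hypothetical defaults; applications may override them.
TIME_THRESHOLD = 0.25    # seconds (time threshold 2403)
PRESSURE_MEDIUM = 0.4    # normalized pressure (threshold 2401)
PRESSURE_HARD = 0.7      # normalized pressure (threshold 2402)

def classify_tap(samples):
    """samples: list of (time_seconds, pressure) pairs for a single contact."""
    peak_in_window = max((p for t, p in samples if t <= TIME_THRESHOLD), default=0.0)
    if peak_in_window > PRESSURE_HARD:
        return "hard"
    if peak_in_window > PRESSURE_MEDIUM:
        return "medium"
    return "normal"

print(classify_tap([(0.05, 0.2), (0.10, 0.3)]))                # normal
print(classify_tap([(0.05, 0.3), (0.12, 0.55)]))               # medium
print(classify_tap([(0.05, 0.5), (0.10, 0.8), (0.30, 0.2)]))   # hard
```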
- FIG. 25 is a flowchart for a method for responding to a user interaction provided by an illustrative embodiment of the invention.
- computer 110 receives a stylus input upon digitizer 203 , providing data about the tap including location and pressure over time. Given this information, at decision 2502 , computer 110 determines whether the contact location has moved within a predetermined threshold distance from its initial point. If not, then no tap is found, as the input may be the beginning of a drag operation, or possibly handwriting. Alternatively, computer 110 may decide that such an input is a normal tap. If stylus 204 remains within the threshold distance during the input, then at decision 2503 , computer 110 determines whether the applied pressure exceeds a predetermined threshold pressure within a predetermined time threshold. If not, then a normal tap is detected and the appropriate function is executed at step 2505 . If the threshold pressure was reached within the time threshold, then a hard tap is detected and the appropriate function is executed at step 2504 .
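- Pulling the pieces together, the sketch below follows the FIG. 25 flow: reject the input as a tap if the contact drifts beyond the distance threshold, report a hard tap if the pressure threshold is exceeded within the time threshold, and otherwise report a normal tap. The sample format and threshold values are assumptions of this sketch, not the patent's own implementation.

```python
# Minimal end-to-end sketch of the FIG. 25 flow. Each input sample is
# (time_seconds, x, y, pressure); the threshold values are hypothetical.
DISTANCE_THRESHOLD = 1     # unit squares the contact may drift (FIG. 20)
TIME_THRESHOLD = 0.25      # seconds within which the pressure must peak
PRESSURE_THRESHOLD = 0.7   # normalized pressure defining a hard tap

def interpret_tap(samples):
    """Return 'hard tap', 'normal tap', or 'not a tap' for one contact."""
    t0, x0, y0, _ = samples[0]
    for t, x, y, pressure in samples:
        if max(abs(x - x0), abs(y - y0)) > DISTANCE_THRESHOLD:
            return "not a tap"       # decision 2502: likely a drag or handwriting
        if t - t0 <= TIME_THRESHOLD and pressure > PRESSURE_THRESHOLD:
            return "hard tap"        # decision 2503 satisfied -> step 2504
    return "normal tap"              # pressure never peaked in time -> step 2505

print(interpret_tap([(0.00, 5, 5, 0.2), (0.08, 5, 5, 0.8)]))   # hard tap
print(interpret_tap([(0.00, 5, 5, 0.2), (0.30, 5, 5, 0.8)]))   # normal tap
print(interpret_tap([(0.00, 5, 5, 0.2), (0.05, 9, 5, 0.9)]))   # not a tap
```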
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
- This application is a continuation of and claims benefit of priority to U.S. patent application Ser. No. 11/017,073, filed on Dec. 21, 2004.
- This invention relates generally to managing input to computers. More particularly, the invention allows for acting on a user's input in accordance with the physical pressure exerted by the user when providing the input.
- In the field of computing, stylus-based input is growing more prevalent with the increasingly widespread use of computing devices such as personal digital assistants (PDAs), tablet computers, laptop computers, and the like. The input devices used to detect contact and position by a stylus shall hereinafter be referred to as digitizers. A digitizer may be attached to or otherwise integrated with the display surface of a display device. The display device may further be separate from or integral with a computing device. For example, a tablet computer typically has an integral display device with a digitizer. Alternatively, a digitizer may take the form of an external peripheral such as a digitizing tablet. Some types of digitizers are able to translate a stylus contact with the digitizer into a two-dimensional position that corresponds to coordinates on the display surface closest to where the tip of the stylus came into contact.
- Along with position, digitizers are able to detect the amount of pressure exerted by the user when making contact with a stylus. This information may be passed to the computing device in the form of a magnitude, perhaps as an eight-bit number. Typically, however, most operating systems, applications, and other software ignore this information, primarily interpreting contact pressure as a single click, regardless of magnitude. Notable exceptions include Adobe's Photoshop® and similar graphics programs, which use pressure on a digitizer tablet to simulate the varying strokes of a virtual paint brush.
- It would be an enhancement for graphical interface users to take full advantage of pressure as an input throughout an operating system, applications, and/or other software. It would also be an enhancement for graphical interface users to develop faster and more accurate control of interface functions using existing hardware. Further, it would be an enhancement for graphical interface users to achieve additional intuitive functionality from existing input devices without confusing users with additional buttons or switches.
- The following presents a simplified summary in order to provide a basic understanding of some aspects of the invention. The summary is not an extensive overview of the invention. It is neither intended to identify key or critical elements of the invention nor to delineate the scope of the invention. The following summary merely presents some concepts of the invention in a simplified form as a prelude to the more detailed description below.
- A first embodiment of the invention provides a computer-implemented method for adjusting a displayed control. An input from a digitizer, pressure-sensitive mouse, etc. is received at a location corresponding to the displayed control. The amount of pressure applied by the user is determined, and the displayed control is adjusted depending on the amount of pressure applied.
- A second embodiment of the invention provides a computer-implemented method for responding to a user interaction. A tap is received upon a display device, and it is determined whether the tap was a hard tap. If a hard tap, one function is performed, but if not, a separate function is performed.
- A third embodiment of the invention provides a computer-implemented method for performing a function responding to a tap. The tap is received and location, pressure, and length of time of the tap are analyzed to determine if the tap is a hard tap. If the pressure exceeds a certain threshold within a certain amount of time, the tap is found to be a hard tap, and a particular function is performed. Failing the test, a different function is performed.
- A fourth embodiment of the invention provides a computer-implemented method for interacting with displayed objects. When receiving a selection of objects via pressure enhanced input, the amount of pressure plays a role in determining the quantity/type/etc. of objects to be selected.
- A more complete understanding of the present invention and the advantages thereof may be acquired by referring to the following description in consideration of the accompanying drawings, in which like reference numbers indicate like features, and wherein:
-
FIG. 1 illustrates an operating environment that may be used for one or more aspects of an illustrative embodiment of the invention. -
FIG. 2 is a plan view of a digitizer display of an illustrative computing device. -
FIG. 3 depicts a graph of pressure over time and visual feedback provided by an illustrative embodiment of the invention. -
FIG. 4 illustrates movement of a scrollbar provided by an illustrative embodiment of the invention. -
FIG. 5 illustrates movement of a scrollbar provided by an illustrative embodiment of the invention. -
FIG. 6 illustrates incrementing a spinner control in a manner provided by an illustrative embodiment of the invention. -
FIG. 7 illustrates incrementing a spinner control in a manner provided by an illustrative embodiment of the invention. -
FIG. 8 illustrates resizing of an object in a manner provided by an illustrative embodiment of the invention. -
FIG. 9 illustrates resizing of an object in a manner provided by an illustrative embodiment of the invention. -
FIG. 10 is a flowchart for a method for adjusting a displayed control provided by an illustrative embodiment of the invention. -
FIG. 11 illustrates selecting text in a manner provided by an illustrative embodiment of the invention. -
FIG. 12 illustrates selecting drawing objects in a manner provided by an illustrative embodiment of the invention. -
FIG. 13 illustrates using encounter selection to select file and folder objects in a manner provided by an illustrative embodiment of the invention. -
FIG. 14 illustrates using an encounter selection to select file and folder objects in a manner provided by an illustrative embodiment of the invention. -
FIG. 15 is a flowchart for a method for selecting displayed objects provided by an illustrative embodiment of the invention. -
FIG. 16 illustrates movement of a scrollbar provided by an illustrative embodiment of the invention. -
FIG. 17 illustrates movement of a scrollbar provided by an illustrative embodiment of the invention. -
FIG. 18 illustrates selection of a file provided by an illustrative embodiment of the invention. -
FIG. 19 illustrates displaying a context sensitive menu provided by an illustrative embodiment of the invention. -
FIG. 20 depicts a distance threshold for determining a type of tap provided by an illustrative embodiment of the invention. -
FIG. 21 depicts a graph of input pressure over time not resulting in a hard tap as provided by an illustrative embodiment of the invention. -
FIG. 22 depicts a graph of input pressure over time resulting in a hard tap as provided by an illustrative embodiment of the invention. -
FIG. 23 depicts a graph of input pressure over time not resulting in a hard tap as provided by an illustrative embodiment of the invention. -
FIG. 24 depicts various input pressure thresholds over time for determining a type of tap as provided by an illustrative embodiment of the invention. -
FIG. 25 is a flowchart for a method for responding to a user interaction provided by an illustrative embodiment of the invention. - In the following description of various illustrative embodiments, reference is made to the accompanying drawings, which form a part hereof, and in which are shown by way of illustration various embodiments in which the invention may be practiced. It is to be understood that other embodiments may be utilized, and structural and functional modifications may be made, without departing from the scope and spirit of the present invention.
-
FIG. 1 illustrates an example of a suitable computing system environment 100 in which the invention may be implemented. The computing system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the invention. Neither should computing environment 100 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in illustrative operating environment 100. - The invention is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to, personal computers (PCs); server computers; hand-held and other portable devices such as personal digital assistants (PDAs), tablet PCs or laptop PCs; multiprocessor systems; microprocessor-based systems; set top boxes; programmable consumer electronics; network PCs; minicomputers; mainframe computers; distributed computing environments that include any of the above systems or devices; and the like.
- The invention may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
- With reference to
FIG. 1, illustrative computing system environment 100 includes a general purpose computing device in the form of a computer 110. Components of computer 110 may include, but are not limited to, a processing unit 120, a system memory 130, and a system bus 121 that couples various system components including system memory 130 to processing unit 120. System bus 121 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, Advanced Graphics Port (AGP) bus, and Peripheral Component Interconnect (PCI) bus, also known as Mezzanine bus. -
Computer 110 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 110 such as volatile, nonvolatile, removable, and non-removable media. By way of example, and not limitation, computer readable media may include computer storage media and communication media. Computer storage media may include volatile, nonvolatile, removable, and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, random-access memory (RAM), read-only memory (ROM), electrically-erasable programmable ROM (EEPROM), flash memory or other memory technology, compact-disc ROM (CD-ROM), digital video disc (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 110. Communication media typically embodies computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF) such as BLUETOOTH standard wireless links, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media. - The
system memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory such as ROM 131 and RAM 132. A basic input/output system (BIOS) 133, containing the basic routines that help to transfer information between elements within computer 110, such as during start-up, is typically stored in ROM 131. RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120. By way of example, and not limitation, FIG. 1 illustrates software including operating system 134, application programs 135, other program modules 136, and program data 137. -
Computer 110 may also include other computer storage media. By way of example only, FIG. 1 illustrates a hard disk drive 141 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 151 that reads from or writes to a removable, nonvolatile magnetic disk 152, and an optical disk drive 155 that reads from or writes to a removable, nonvolatile optical disk 156 such as a CD-ROM, DVD, or other optical media. Other computer storage media that can be used in the illustrative operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital video tape, solid state RAM, solid state ROM, and the like. The hard disk drive 141 is typically connected to the system bus 121 through a non-removable memory interface such as interface 140, and magnetic disk drive 151 and optical disk drive 155 are typically connected to the system bus 121 by a removable memory interface, such as interface 150. - The drives and their associated computer storage media discussed above and illustrated in
FIG. 1 provide storage of computer-readable instructions, data structures, program modules and other data for computer 110. In FIG. 1, for example, hard disk drive 141 is illustrated as storing an operating system 144, application programs 145, other program modules 146, and program data 147. Note that these components can either be the same as or different from operating system 134, application programs 135, other program modules 136, and program data 137, respectively. Operating system 144, application programs 145, other program modules 146, and program data 147 are given different numbers in FIG. 1 to illustrate that, at a minimum, they are different copies. A user may enter commands and information into computer 110 through input devices such as a keyboard 162 and pointing device 161, commonly referred to as a mouse, trackball or touch pad. Such pointing devices may provide pressure information, providing not only a location of input, but also the pressure exerted while clicking or touching the device. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often coupled to processing unit 120 through a user input interface 160 that is coupled to system bus 121, but may be connected by other interface and bus structures, such as a parallel port, game port, universal serial bus (USB), or IEEE 1394 serial bus (FIREWIRE). A monitor 184 or other type of display device is also coupled to the system bus 121 via an interface, such as a video adapter 183. Video adapter 183 may comprise advanced 2D or 3D graphics capabilities, in addition to its own specialized processor and memory. -
Computer 110 may also include a digitizer 185 to allow a user to provide input using a stylus 186. Digitizer 185 may either be integrated into monitor 184 or another display device, or be part of a separate device, such as a digitizer pad. Computer 110 may also include other peripheral output devices such as speakers 189 and a printer 188, which may be connected through an output peripheral interface 187. -
Computer 110 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180. Remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to computer 110, although only a memory storage device 181 has been illustrated in FIG. 1. The logical connections depicted in FIG. 1 include a local area network (LAN) 171 and a wide area network (WAN) 173, but may also include other networks. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet. - When used in a LAN networking environment,
computer 110 is coupled to the LAN 171 through a network interface or adapter 170. When used in a WAN networking environment, computer 110 may include a modem 172 or another device for establishing communications over WAN 173, such as the Internet. Modem 172, which may be internal or external, may be connected to system bus 121 via user input interface 160 or another appropriate mechanism. In a networked environment, program modules depicted relative to computer 110, or portions thereof, may be stored remotely such as in remote storage device 181. By way of example, and not limitation, FIG. 1 illustrates remote application programs 182 as residing on memory device 181. It will be appreciated that the network connections shown are illustrative and other means of establishing a communications link between the computers may be used. -
FIG. 2 illustrates a computing device that may be used with an embodiment of the invention. The computing device here is a tablet computer 201, which minimally includes a computer display 202 with an integral digitizer 203, and which may receive input via a user pressing a stylus 204 against the digitizer. Computer 110 may be embodied as tablet computer 201. Although tablet computers are used throughout this document as illustrative computing devices, tablet computers are only one among many possible computing devices that may be used to implement the invention. Alternative embodiments may comprise, by way of example, personal computers (PCs), laptop computers, handheld computers such as personal digital assistants (PDAs), cell phones, home electronic equipment, or any other computing device having or being coupled to an input device that detects input pressure, such as a digitizer or a pressure-sensitive pointing device such as a pressure-sensitive mouse, pressure-sensitive trackball, or pressure-sensitive joystick. The term “pressure-sensitive” is intended to refer to a pressure-sensitive input device that is capable of detecting (either directly or indirectly) and distinguishing between different amounts of applied input pressure, as opposed to merely being able to distinguish between input versus non-input. - Returning to
FIG. 2, when stylus 204 comes in contact with the surface of the tablet computer's display 202, the digitizer 203 relays to computer 201 data representing both a two-dimensional location of the contact and an amount of the pressure applied. The amount of pressure may be represented as a magnitude (e.g., a numeric value along a range of numeric values), a pressure category (e.g., light, medium, heavy, etc.), or in any other manner. Digitizer 203 may continuously update this information over time as stylus 204 moves around the display surface and as the contact pressure increases or decreases.
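The mapping from a raw magnitude to a pressure category is not specified above. The following is a minimal sketch of how such a bucketing might look, assuming the eight-bit magnitude mentioned earlier; the function name and cutoff values are illustrative and not taken from the patent.

```python
def pressure_category(magnitude: int) -> str:
    """Map a raw 8-bit pressure magnitude (0-255) to a coarse category.

    The cutoffs below are illustrative assumptions, not values from the
    patent; a real system would expose them as configurable settings.
    """
    if magnitude < 64:
        return "light"
    if magnitude < 160:
        return "medium"
    return "heavy"

# Example: a mid-range reading reported by the digitizer.
print(pressure_category(120))  # -> "medium"
```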
Stylus 204 may be any type of stylus, such as a man-made object or a human body part, such as a fingertip. A man-made stylus may include, but is not limited to, a passive- or active-type pen-like stylus such as is conventionally provided with many PDAs and tablet computers. - An
illustrative graph 301 of contact pressure over time is shown in FIG. 3, which also depicts two forms of visual feedback provided on the display 202 by illustrative embodiments of the invention. Here, stylus 204 is pressed against the surface of the display 202, and thus against the integrated digitizer 203 as well. In this example, the contact pressure from the stylus 204 is gradually increased and then decreased over time, as depicted in graph 301, which is typical of a tap of the stylus 204 against the digitizer 203. To give an impression of scale, the timeframe spanned by graph 301 may be just a fraction of a second, although the timeframe may be shorter or longer. The magnitude or other representation of the applied pressure may be sampled over time at various sample moments. For example, at a first moment in time, stylus 204 has just started pressing against digitizer 203 at point 311. The slight depression in the surface of digitizer 203 is detected as representative of the applied pressure, and the amount of pressure is shown as pressure magnitude 312 along graph 301. This value may be passed to the computer (not shown), which may modify its behavior depending on the pressure, and also may provide feedback to a user on the display. Pressure may be detected in any of a number of ways, such as by directly measuring the pressure or by estimating the pressure in accordance with other variables, such as the amount of surface area physically affected by the pressing of the tip of the stylus 204 against the digitizer 203. -
Computer 110 may provide visual, tactile, and/or audible feedback to the user in accordance with the amount of pressure applied by the stylus 204. The pressure feedback provided by computer 110 may thus take many forms, any of which may alert the user to the level of pressure currently being exerted. For example, visual forms of feedback may involve modifying the shape, color, size, or transparency of a cursor or an underlying control presented on display 202. Alternatively, visual feedback may take the form of a pressure gauge, which may be depicted at a fixed location on display 202. Audible forms of feedback may involve producing a series of clicks or varying the pitch of a sound in conjunction with the changing pressure. Tactile forms of feedback may include one or more intensities or frequencies of vibration of stylus 204 or a housing of computer 110.
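As a rough illustration of the visual form of feedback, the sketch below derives the diameter of a cursor halo (described further below) from a normalized pressure value, and suppresses the halo when pressure is not relevant to the foreground application. The function, its pixel bounds, and the linear scaling are assumptions made for illustration only.

```python
from typing import Optional

def halo_diameter(pressure: float, app_uses_pressure: bool,
                  min_px: int = 8, max_px: int = 40) -> Optional[int]:
    """Return a cursor-halo diameter in pixels for pressure in [0.0, 1.0],
    or None when no halo should be drawn.

    Hiding the halo when the foreground application ignores pressure
    mirrors the selective-feedback idea discussed below; the pixel
    bounds and linear scaling are illustrative assumptions.
    """
    if not app_uses_pressure:
        return None                      # draw only the conventional cursor
    pressure = max(0.0, min(1.0, pressure))
    return round(min_px + (max_px - min_px) * pressure)

print(halo_diameter(0.25, True))   # small halo
print(halo_diameter(0.90, True))   # large halo
print(halo_diameter(0.90, False))  # None: pressure not relevant here
```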
FIG. 3 depicts two illustrative visual embodiments for providing pressure feedback to a user, but there are many other ways to provide this information. Referring again to the first moment in time in graph 301, the value of pressure magnitude 312 is relayed to computer 110, which may display a cursor 313 at a location corresponding to the point of contact of stylus 204 on the display. Here, the point of arrow 317 is surrounded by cursor halo 314, the shape and/or size (e.g., diameter) of which is dependent upon the current amount of pressure being applied. Alternatively, computer 110 may retain the appearance of a cursor 315, and instead modify the appearance of an underlying displayed control 316, which is exemplified here as a button. It should be noted that cursors may vary, and can include hourglasses, pinwheels, insertion points, and so forth. In addition, a host of underlying displayed elements can be placed under the cursor and be similarly modified. - Moving rightward across
FIG. 3, at a second moment in time in graph 301, stylus 204 further depresses digitizer 203 at point 321, which is recorded as pressure magnitude 322. Point 321 may be in the same location on digitizer 203 as point 311 or a different location. In the first visual feedback example, cursor halo 324 expands or otherwise changes size and/or shape to indicate the higher pressure to the user. Alternatively, in the second visual feedback example, the color of the button 326 darkens (or otherwise changes its appearance) noticeably in response to varying levels of applied pressure. At a third moment in time, digitizer 203 at point 331 is further depressed, registering as pressure magnitude 332. Again, point 331 may be the same location as points 311 and/or 321, or may be a different location, as will be discussed further. In response to the increased pressure, cursor halo 334 expands further to reflect the increase in pressure, or, in the second example, feedback button 336 darkens considerably. At a fourth and final moment in time, stylus 204 is beginning to lift away from digitizer 203 at point 341, which registers as pressure magnitude 342. In response to the decreasing pressure, cursor halo 344 begins to deflate in size, or, in the second example, button 346 begins to lighten in color or otherwise alter its appearance in a manner different from when the pressure was increasing. Again, point 341 may be the same location as points 311, 321, and/or 331, or may be a different location.
- The use of pressure feedback may not be desirable if and when computer 110 is not presently using the pressure information. For example, a first application may utilize pressure information but a second application may not. Thus, where the first application is being executed and/or is in focus, pressure feedback may be provided. However, where the second application is being executed and/or is in focus, then pressure feedback may not be provided. As such, conventional cursors and controls may be used both when pressure is and is not relevant. Feedback, such as the inclusion of a halo around a cursor, may therefore not only provide information on the current level of pressure being applied, but may also relay the fact that pressure is presently relevant. By viewing a cursor halo, or a color-varying control, the user may be informed that additional input can be provided through the use of pressure. This selective providing of pressure feedback may allow controllable pressure input to be an easily-discoverable feature even for an amateur user, through simple trial and error.
- Although illustrative embodiments of the use of pressure are set forth in some detail below, other embodiments are available. For example, pressure information may be used to distinguish between which window receives an input. A softer contact may indicate a normal input to the current window, whereas a harder contact may be directed to another application, such as a media player playing in the background. Enabling such uses of pressure enable a power user to work more efficiently by providing a larger input “vocabulary” without needlessly complicating the interface. At the same time, average users who opt not to use pressure or users of devices without pressure sensitivity will not see their experience degrade. These embodiments only enhance the device interaction for those wishing to use them.
- Additional forms of data related to contact with an input device are possible input modifiers. For example, an input device may detect the surface area of contact, the temperature at the point of contact, or dampness at the point of contact. Each may be used to supplement an input and control device behavior.
- For each of the illustrative embodiments set forth below, a device receiving pressure information uses that information to augment the input it receives. This information can be used in a wide variety of ways, including pressure-sensitive controls, pressure-sensitive selection, and through the use of a variable-pressure tap, as will be discussed in detail.
- Controls in a graphical user interface present intuitive input metaphors that allow users to manipulate and maneuver through data. Such controls are common elements found in graphical computer operating systems, individual applications, and other forms of software. Examples of known controls include, but are not limited to, scrollbars, spinner controls, resize handles, checkboxes, pull down menus, and buttons. In accordance with aspects of the present invention, controls may be augmented through the use of pressure data, providing users with an additional way to interact with displayed controls. For example, by making controls pressure-sensitive, users may be presented with a more manipulable user interface that responds faster and more accurately to their commands and that may more closely operate in accordance with the user's intent. Value-based controls, i.e., controls that manipulate an underlying numeric, alphabetic, or alphanumeric value, may be better served by allowing them to be pressure-sensitive. As another example, repeating controls, i.e., controls that repeat the same action when continuously selected, may also be better served by the addition of pressure sensitivity. A few examples of pressure sensitive controls are set forth below.
-
FIG. 4 illustrates movement of a vertical scrollbar 403 provided by an illustrative embodiment of the invention. Scrollbar 403 graphically displays and updates an underlying index value indicating a current position within a corresponding document 404. FIG. 4 is divided into two arbitrary “frames” 410, 411 for explanatory purposes only. The frames represent two points in time, starting with portion 410 and ending with portion 411. In first frame portion 410 of FIG. 4, a cursor 401, with a cursor halo 402, indicates that the user is presently activating the down arrow button of vertical scrollbar 403. Scrollbar 403 indicates the vertical placement of document 404 according to the vertical location of a thumb control 405. The size of cursor halo 402 signifies a relatively small amount of pressure being applied. After a period of time, in second frame 411, the user has continued to activate the down arrow button, and the displayed portion of document 404 has scrolled down (which is actually implemented by moving document 404 up), as indicated by the new location of thumb control 405. Because of the small amount of pressure applied by the user, the computer scrolls scrollbar 403 relatively slowly. -
FIG. 5 illustrates the same embodiment depicted in FIG. 4, and like FIG. 4 is also divided into two time-lapsed frames 510 and 511. In first frame 510, scrollbar 403 is at the same starting point as in frame 410, but this time, the user is pressing harder with stylus 204, as indicated by a larger cursor halo 502. As a result, in second frame 511, thumb control 405 indicates that the displayed portion of document 404 has scrolled down further than it did in FIG. 4. Because of the greater amount of pressure applied by the user when activating the down arrow button, computer 110 scrolls scrollbar 403 faster. Thus, in the presented embodiment, the scrolling speed of scrollbar 403 is dependent upon the amount of pressure applied to the down arrow button of scrollbar 403. The same may apply to the up arrow button of scrollbar 403, as well as the left and right arrow buttons of a horizontal scrollbar. The scrolling speed of scrollbar 403 may have any relationship to the applied pressure, such as a linear relationship or a nonlinear relationship.
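A sketch of one possible pressure-to-scroll-rate mapping follows. The base rate, maximum rate, and exponent are illustrative assumptions; an exponent of 1.0 gives the linear relationship mentioned above, while a larger exponent gives a nonlinear one that keeps light presses slow and precise.

```python
def scroll_rate(pressure: float, base_lines_per_sec: float = 4.0,
                max_lines_per_sec: float = 60.0, exponent: float = 2.0) -> float:
    """Lines scrolled per second while an arrow button is held down.

    All constants are illustrative assumptions; a toolkit would expose
    them as settings rather than hard-code them.
    """
    pressure = max(0.0, min(1.0, pressure))
    return base_lines_per_sec + (max_lines_per_sec - base_lines_per_sec) * pressure ** exponent

print(scroll_rate(0.2))   # gentle press: close to the base rate
print(scroll_rate(0.9))   # firm press: near the maximum rate
```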
FIG. 6 illustrates incrementing a spinner control 601 provided by an illustrative embodiment of the invention. Spinner control 601 has an entry field 602, where a user can view or change a numeric value, and up arrow button 603 and down arrow button 604, the selection of which respectively increments or decrements the value in the entry field. As in FIGS. 4 and 5, FIG. 6 is also divided into frames 610, 611, and 612. In first frame 610, entry field 602 contains an initial value of 0. Once a user activates up arrow button 603, the value in entry field 602 begins to increment. In second frame 611, this activation is indicated by arrow cursor 605 with cursor halo 606. As indicated by the size of cursor halo 606, light pressure is being used to manipulate the control, and as such, the value in entry field 602 has increased at a rate and/or increment (e.g., incrementing by 1) that depends upon the applied pressure. In third frame 612, the increase continues at a rate and/or increment that depends upon the applied pressure. In FIG. 6, it is assumed that the same amount of time has elapsed between each of the successive frames. -
FIG. 7 illustrates the same embodiment depicted in FIG. 6, and like FIG. 6 is also divided into three time-lapsed frames 710, 711, and 712. In first frame 710, spinner control 601 has been reset to the same initial value as in frame 610. However, in second frame 711, when greater pressure is applied to activate up arrow button 603, the value in entry field 602 is updated using a larger increment and/or at a greater rate than in FIG. 6. The higher pressure is indicated by the larger cursor halo 706. In third and final frame 712, the higher pressure continues updating the value using the larger increment and/or at a greater rate. Thus, the value of a spinner control may increase or decrease by an increment and/or at a rate that depends linearly or nonlinearly upon the applied pressure.
- Additionally or alternatively, with value-based controls, such as the spinner and scrollbar controls, pressure sensitivity may be used to control the rate of adjustment of a value or index. Although the applied pressure remained constant in the above examples, that need not be the case. Indeed, a user may continuously and dynamically increase and decrease the pressure applied when providing an input in order to affect the operation of the control. As the pressure changes, so does the effect on the control, such as the rate of change. Moreover, the value need not be numeric, and could be chosen from any defined set, such as a set of characters, words, colors, patterns, and so forth, depending on the particular need. For example, a color selection can be made by using pressure to speed and slow the rate of change among a selection of colors and/or to control the intensity or other property of a color. Another embodiment may involve selecting a state in a pull down menu and using pressure to speed and slow the movement of the list.
- Pressure sensitivity may further be used to constrain the change of a value, as depicted for example in FIGS. 8 and 9. FIG. 8 illustrates resizing a drawing object 801 using an illustrative embodiment of the invention. The context here may be a drawing application or other program that allows users to resize objects. In FIG. 8, drawing object 801 is being resized by a user. Arrow cursor 803 having cursor halo 804 is positioned over resize handle 802 at one corner of object 801. Light pressure is applied by the user, as indicated by the small cursor halo 804. As such, the application allows the user to resize following a smooth, freeform path. At the end of the path, resize handle 802 has been repositioned at location 805.
- FIG. 9 illustrates the same embodiment of the invention depicted in FIG. 8. The difference here is that the user is applying more pressure, as indicated by larger cursor halo 904. Because of this greater pressure, the application may cause the movement of resize handle 802 to be non-smooth, such as by constraining it to fixed positions on a predetermined grid, for example. Here, resize handle 802 is relocated at location 905. Along the way, resize handle 802 is limited to locations along a fixed grid, which may or may not be visible to the user. Alternatively, the amount of pressure applied may affect the grid increment by which object 801 can be resized. In addition, an operating system or application may reverse the relationship, constraining resize movement to a grid when less pressure is applied and allowing freeform resizing only when more pressure is applied.
- The pressure-sensitive control (i.e., resize handle 802) disclosed in FIGS. 8 and 9 is not necessarily limited to resizing objects. It may apply to other controls such as for moving objects, for example. It may also be used for resizing or repositioning windows in an operating system or application. Ultimately, the technique can be used in any instance where values defining an object's screen position, shape, size, and/or any other property are being modified. Furthermore, other forms of pressure-sensitive controls may be implemented. For example, a button or other control that performs an action when activated and that repeats the action when continuously activated may be pressure-sensitive. Such a repeating control may repeat the action faster when greater pressure is applied to the input, and likewise slow down the rate of repetition when less pressure is applied. An additional implementation involves the use of harder pressure to vary the behavior of a drag operation. For example, dragging a drawing object while pressing harder may cause the object to paste in repeated fashion, similar to a rubber stamp effect; the harder the drag, the more frequent the repeating stamps. -
FIG. 10 illustrates a method for adjusting a displayed control provided by an illustrative embodiment of the invention. In step 1001, computer 110 displays a pressure sensitive control on display 202, for example a scrollbar, resize handle, button, spinner control, and so forth. When a user directs input to the control, coupled with pressure applied by the user with stylus 204, computer 110 receives the input, as in step 1002. At step 1003, computer 110 determines the amount of pressure applied by the user. Alternatively, computer 110 may, at this point, compare the pressure applied to an amount applied previously in order to determine its relative magnitude. At step 1004, computer 110 moves or modifies the respective control (including adjusting any underlying value) in accordance with the amount of pressure applied. A greater amount of pressure from the user may cause adjustments to the control carried out in a first manner, perhaps by speeding up or using larger increments, whereas less pressure may cause adjustments to occur in a second manner, such as slower adjustments, or even cause the control to behave as if not pressure-sensitive. At decision 1005, computer 110 determines whether the user input continues, and if so, performs steps 1002-1005 again. Otherwise, the method terminates and/or awaits another user input.
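The flow just described can be rendered schematically as an event loop. The PenEvent and DemoControl types below are hypothetical stand-ins for a real windowing toolkit; only the sequence of steps mirrors the method of FIG. 10.

```python
from dataclasses import dataclass
from typing import Iterable, Tuple

@dataclass
class PenEvent:
    location: Tuple[int, int]
    pressure: float      # normalized 0.0-1.0
    is_release: bool = False

class DemoControl:
    """Stand-in for a pressure-sensitive control; a real toolkit object
    would redraw itself and adjust an underlying value instead."""
    def contains(self, location: Tuple[int, int]) -> bool:
        return True                              # pretend every event hits the control
    def adjust(self, pressure: float) -> None:
        print(f"adjusting control with pressure {pressure:.2f}")

def run_pressure_control_loop(control, events: Iterable[PenEvent]) -> None:
    for event in events:                         # step 1002: receive user input
        if not control.contains(event.location):
            continue                             # input not aimed at this control
        control.adjust(event.pressure)           # steps 1003-1004: use the applied pressure
        if event.is_release:                     # decision 1005: does the input continue?
            break

run_pressure_control_loop(DemoControl(),
                          [PenEvent((5, 5), 0.2), PenEvent((5, 5), 0.7),
                           PenEvent((5, 5), 0.7, is_release=True)])
```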
-
FIG. 11 illustrates selecting text in a word processing application in a manner provided by an illustrative embodiment of the invention. As in several of the previous figures, FIG. 11 is divided into three frames 1100, 1110, and 1120. In first frame 1100, a word 1102 in paragraph 1101 is being selected by cursor 1103 having cursor halo 1104. A user applies a small amount of pressure to select word 1102. Alternatively, with a small amount of pressure, the user may simply place an insertion point in the middle of word 1102. As the user presses harder, she begins to select more text, as shown in second frame 1110, wherein more of paragraph 1101, such as a line or sentence, is selected in the form of sentence 1111. The greater pressure is reflected as visual feedback in the form of larger cursor halo 1114. In third frame 1120, the user presses harder still, reflected in larger cursor halo 1124, which selects the entire paragraph 1101. Alternatively, pressing even harder may select a whole document or document section. Conversely, decreasing the selection pressure may deselect a paragraph and select only a word or sentence again. Thus, different levels of applied pressure on a displayed document may cause different amounts of the document to be selected.
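A minimal sketch of this word/sentence/paragraph behavior follows, assuming a normalized pressure and arbitrary band edges; neither the function nor its thresholds come from the patent.

```python
def selection_scope(pressure: float) -> str:
    """Choose how much of a document a press should select.

    The band edges are illustrative; pressing harder widens the
    selection and easing off narrows it again, as described above.
    """
    if pressure < 0.33:
        return "word"
    if pressure < 0.66:
        return "sentence"
    if pressure < 0.9:
        return "paragraph"
    return "section"

for p in (0.2, 0.5, 0.8, 0.95):
    print(p, "->", selection_scope(p))
```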
FIG. 12 illustrates selecting drawing objects in a drawing software program in a manner provided by an illustrative embodiment of the invention. As in FIG. 11, FIG. 12 is divided into three frames 1200, 1210, and 1220. In first frame 1200, drawing object 1201 is selected by a user using a small amount of pressure, as indicated by cursor 1202 having a small cursor halo 1203. Selected object 1201 may be surrounded by a selection tool embodied as a selection border 1204. Each drawing object within the selection border 1204 is part of the selection. In this example, the size of the selection border 1204 (i.e., the area encompassed by the selection border 1204) depends upon the amount of applied pressure. As indicated by the size of cursor halo 1203, the user is pressing lightly in order to select the object currently under cursor 1202. As the user presses harder, in second frame 1210, selection border 1204 grows in accordance with the higher applied pressure, and in this case grows sufficiently large so as to encompass more objects, including, for example, object 1215. Cursor halo 1213 reflects the increasing pressure applied by the user. As the user presses harder still, in third and final frame 1220, selection border 1204 grows larger still, encompassing more drawing objects, in this case including, for example, object 1226. Selection border 1204 may be constrained to grow to encompass only objects connected or adjacent to the originally-selected object, or may be configured to also encompass objects not connected or adjacent to the originally-selected object. As before, reducing the pressure applied may return the selection to a smaller number of objects.
-
FIG. 13 illustrates using an encounter selection tool to select file and folder objects by an illustrative embodiment of the invention. Here, a subset of a collection 1301 of files and folders is selected by dragging cursor 1302, having cursor halo 1303, from start point 1304 to end point 1306. Folders and files encountered along the path of the cursor, for example folder 1305, are selected and highlighted as shown. The user may then perform a common function among all of the files, such as throwing them away. As can be seen from the cursor halo, the user is only pressing lightly when he uses the encounter select tool. This leads to a relatively narrow selection path. In this embodiment, the lighter the pressure, the narrower the selection path, and thus in general the fewer the number of objects selected along the selection path. -
FIG. 14 illustrates the same collection 1301 of files and folders presented in FIG. 13. Here, however, the user presses harder while moving cursor 1302 from start point 1304 to end point 1306, as reflected in larger cursor halo 1403. The result of the increased pressure is that a wider selection path is created and a larger number of objects is selected, including, for example, document 1405. Although this embodiment has been discussed with regard to a freeform selection path that follows movement of the cursor 1302, the selection path may take other forms such as a linear path that extends between the start point 1304 and the end point 1306, regardless of the path the cursor 1302 takes between the start and end points 1304 and 1306.
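The widening of the selection path can be illustrated with a little geometry. In the sketch below, pressure sets the path width and a simple point-to-segment distance test decides whether an object's position falls inside the linear-path variant just mentioned; all names and constants are assumptions.

```python
import math

def path_width(pressure: float, min_px: float = 6.0, max_px: float = 120.0) -> float:
    """Width of the encounter-selection path for a normalized pressure."""
    pressure = max(0.0, min(1.0, pressure))
    return min_px + (max_px - min_px) * pressure

def hits_segment(point, start, end, width) -> bool:
    """True if `point` lies within half the path width of the straight
    segment from `start` to `end`."""
    (px, py), (sx, sy), (ex, ey) = point, start, end
    seg_dx, seg_dy = ex - sx, ey - sy
    length_sq = seg_dx ** 2 + seg_dy ** 2
    t = 0.0 if length_sq == 0 else max(0.0, min(1.0, ((px - sx) * seg_dx + (py - sy) * seg_dy) / length_sq))
    nearest = (sx + t * seg_dx, sy + t * seg_dy)
    return math.dist((px, py), nearest) <= width / 2

# A firmer press widens the path, so an icon 40 px off the line is caught.
print(hits_segment((50, 40), (0, 0), (100, 0), path_width(0.2)))  # False
print(hits_segment((50, 40), (0, 0), (100, 0), path_width(0.9)))  # True
```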
FIG. 15 illustrates one method for selecting displayed objects provided by an illustrative embodiment of the invention. In step 1501, a collection of selectable items such as characters, drawing objects, file icons, etc. is displayed. A user then selects at least one of the items with stylus 204, and the stylus input is received in step 1502. At step 1503, the amount of pressure applied by the user with the stylus 204 is determined so that, in step 1504, the identities of which items are selected can be modified. For example, when a user presses harder during a selection operation, the selection path widens, and so more of the items may be selected. If there is more input from the user, at decision 1505, the steps continue. Otherwise, the method terminates normally or awaits further user input.
- Tapping a digitizer is a common form of user interaction with a computing device, such as a tablet computer, which can be enhanced by exploiting pressure information available from the digitizer. The term tap includes the contact and removal of a stylus, such as a pen, finger, or any other pointing implement, with the surface of the digitizer. Typically, a tap is interpreted as equivalent to a mouse click, regardless of how much force was used to impact the digitizer. Pressure information can be used, however, to distinguish normal taps from hard taps, enabling new sources of user input. For example, a tap with an applied pressure within a given first pressure range may be considered a normal tap, whereas a tap with an applied pressure within a given higher second pressure range may be considered a hard tap. Any number of pressure ranges may be defined with an associated tap type. A normal tap may, for example, be interpreted as a simple click, and a hard tap may, for example, be used to trigger additional functionality. For example, a hard tap may be interpreted as a double-tap (notoriously difficult on digitizer displays), or as a right click, or as a trigger for an on-screen keyboard for tapping out words and sentences, or as a request to launch an application, or as a middle click (on a three button mouse), or as a scroll wheel click, and so forth.
-
FIGS. 16 and 17 illustrate one embodiment of hard taps. As in some previous figures, FIGS. 16 and 17 are each divided into two arbitrary frames. FIG. 16 depicts movement of a scrollbar under conditions of a normal tap provided by an illustrative embodiment of the invention. Here, in first frame 1610, a user taps on scrollbar 1603, the tap being indicated by the temporary placement of cursor 1601 and starburst halo 1602. The starburst halo 1602 may signify to the user that a tap is being received as opposed to a press-and-hold. The small starburst halo 1602 indicates, in this case, that the tap was not very hard. In second frame 1611, the results of the tap are viewable. Document 1604 has scrolled down one page, and the thumb control 1605 has shifted down. -
First frame 1710 of FIG. 17 illustrates the same starting position of document 1604 and thumb control 1605. A user taps the same location on scrollbar 1603 as before, but this time taps harder. The starburst halo 1702 that temporarily appears is larger, indicating that a harder tap was registered than in FIG. 16. Rather than page down as before, the hard tap in this case triggers a different function. As can be seen in second frame 1711, thumb control 1605 jumps directly to the location of the harder tap. This can be useful for a user who wants to go directly to a portion of document 1604 without waiting for the scrollbar to page down. -
FIGS. 18 and 19 illustrate a second embodiment of hard and normal taps. As before, the figures are each divided into two arbitrary frames. In first frame 1810 of FIG. 18, a file 1801 receives a single normal tap, as signified by cursor 1802 with starburst halo 1803. As a result of the lower pressure applied, in second frame 1811, computer 110 interprets the normal tap as a left click and file 1804 is highlighted. The first frame 1910 of FIG. 19 differs from that of FIG. 18 in that file 1801 receives a hard tap, as signified by larger starburst halo 1903. As a result, computer 110 performs a separate action in second frame 1911, treating the hard tap as a right click, and displaying the context sensitive menu rather than selecting the file.
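Once a tap has been classified, dispatching it is straightforward. The sketch below maps a normal tap to selection and a hard tap to a context menu, one of several assignments the text mentions; the function and its return strings are purely illustrative.

```python
def handle_tap(tap_kind: str, target) -> str:
    """Dispatch a classified tap to a handler.

    `target` stands in for whatever object was tapped; mapping a hard
    tap to a context menu (rather than, say, a double-tap or scroll
    wheel click) is only one of the choices described above.
    """
    if tap_kind == "hard":
        return f"show context menu for {target}"
    return f"select {target}"        # a normal tap behaves like a left click

print(handle_tap("normal", "file 1801"))
print(handle_tap("hard", "file 1801"))
```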
- Distinguishing a hard tap from other user inputs on a digitizer may involve determining whether the tip of
stylus 204 maintains a substantially constant position on the surface ofdigitizer 203. A tap that moves across the surface ofdigitizer 203 is more likely an intended drag operation rather than a tap, and so a distance threshold may be used to ensure that the point of contact does not move too far. Such a threshold is depicted inFIG. 20 , which provides an illustrative layout for a distance threshold on an X-Y coordinate plane defining locations on the surface ofdigitizer 203. Here, the initial point of contact is represented by the black unit square in the middle of the figure. Unit squares inFIG. 18 are shaded for explanatory purposes only, and are not necessarily displayed as shown, if at all, to the user. Each unit square (or other shaped area) may represent a pixel on the underlying display and/or a minimum resolvable area that may be sensed bydigitizer 203, or any other unit of area, whether arbitrary or whether defined in accordance with properties ofdisplay 202 and/ordigitizer 203. The immediately adjacent squares or pixels (cross-hatched inFIG. 18 ) form a distance threshold. By way of example,stylus 204 may initially impact the pressure-sensitive surface ofdigitizer 203 at the black square, andstylus 204 may immediately thereafter slide just a little bit. However, if, over time,stylus 204 moves outside the distance threshold (i.e., in this example, moves outside the cross-hatched unit squares), then a hard tap is not registered bycomputer 110. This distance threshold is configurable. If a user has a difficult time holding a pointing implement steady, she may be able to adjust the distance threshold, such as by increasing the distance threshold to include a larger range of acceptable contact points. - If
stylus 204 stays inside the distance threshold for an appropriate period of time, computer 110 still may not register a hard tap. In general, computer 110 may determine whether a tap input by stylus 204 is a hard tap depending upon applied pressure, hold time, and/or slide distance of stylus 204. In this example, for the tap to be a hard tap, the tap input must reach an appropriate pressure threshold within a particular time threshold. FIG. 21 depicts a graph 2101 of input pressure over time not resulting in a hard tap as provided by an illustrative embodiment of the invention. Here, the tap has stayed within the appropriate distance threshold, but by the time the contact passes time threshold 2102 (perhaps set at ¼ of a second or another amount of time) at point 2104, the magnitude of pressure has not passed pressure threshold 2103. As a result, computer 110 may interpret the input as a normal tap or some other type of input. -
FIG. 22 depicts a graph 2201 of input pressure over time resulting in a hard tap as provided by an illustrative embodiment of the invention. Here, the contact being graphed has surpassed a pressure threshold 2203 at a point 2204 before reaching a time threshold 2202. This may therefore be registered as a hard tap, and the operating system or application can process it as such. If the operating system or application does not take advantage of hard taps, then the tap can be interpreted as a normal tap.
- As another example, FIG. 23 depicts a graph 2301 of input pressure over time in which the tap input does not register as a hard tap as provided by an illustrative embodiment of the invention. In this example, the contact with the surface of digitizer 203 eventually exceeds a pressure threshold 2303, but not within a time threshold 2302. When the time of contact exceeds time threshold 2302 at a point 2304, similar to the example in FIG. 21, the input can be passed on as something other than a hard tap, such as a normal tap.
- It should be noted that the time and pressure thresholds used to detect a hard tap may be user- or software-configurable. Individual users can perfect their successful use of the hard tap by adjusting the pressure threshold and time threshold. For example, a particular user may not be able to achieve the pressure magnitude needed, and can adjust the pressure and/or time thresholds as desired, such as to adjust the pressure threshold lower to allow for “softer” hard taps. Pressure and/or time thresholds may further be automatically adjusted through a calibration routine. For example, computer 110, via a user interface, may request that the user execute what the user considers to be a normal tap as well as a hard tap. Computer 110 may measure the pressure and time properties of the user's input and automatically determine appropriate time and/or pressure thresholds in accordance with those properties.
- Alternative embodiments of the hard tap may allow for additional variations of the tap using multiple time, distance, and/or pressure thresholds. For example,
FIG. 24 depicts a set of pressure ranges over time as provided by an illustrative embodiment of the invention. Here, taps not exceeding a pressure threshold 2401 within time threshold 2403 will be considered normal taps. Taps exceeding a pressure threshold 2401, but not exceeding a higher pressure threshold 2402, within time threshold 2403 will be interpreted as medium taps. And taps exceeding pressure threshold 2402 will be interpreted as hard taps. Medium taps may be useful in some interface environments. Individual applications may provide different thresholds depending on their need, overriding thresholds set by the operating system.
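Pulling the distance, time, and pressure thresholds together, the following sketch classifies a sequence of contact samples as a drag, normal, medium, or hard tap. The sample structure and the default threshold values are assumptions made for illustration; in practice they would come from the user or application settings described above.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TapSample:
    t: float        # seconds since initial contact
    x: int          # digitizer coordinates
    y: int
    pressure: int   # raw magnitude, e.g. 0-255

def classify_tap(samples: List[TapSample], origin: Tuple[int, int],
                 max_slide: int = 1, time_limit: float = 0.25,
                 medium_threshold: int = 120, hard_threshold: int = 200) -> str:
    """Classify a contact as 'drag', 'normal', 'medium', or 'hard'.

    A sample outside the distance threshold rules out any tap; only
    pressure reached within the time window upgrades the tap type.
    """
    ox, oy = origin
    best = "normal"
    for s in samples:
        if abs(s.x - ox) > max_slide or abs(s.y - oy) > max_slide:
            return "drag"                      # moved outside the distance threshold
        if s.t <= time_limit:                  # pressure only counts within the time window
            if s.pressure >= hard_threshold:
                best = "hard"
            elif s.pressure >= medium_threshold and best != "hard":
                best = "medium"
    return best

samples = [TapSample(0.05, 10, 10, 90), TapSample(0.12, 10, 11, 210), TapSample(0.30, 10, 11, 40)]
print(classify_tap(samples, origin=(10, 10)))   # 'hard': threshold crossed in time
```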
FIG. 25 is a flowchart for a method for responding to a user interaction provided by an illustrative embodiment of the invention. In step 2501, computer 110 receives a stylus input upon digitizer 203, providing data about the tap including location and pressure over time. Given this information, at decision 2502, computer 110 determines whether the contact location has remained within a predetermined threshold distance from its initial point. If not, then no tap is found, as the input may be the beginning of a drag operation, or possibly handwriting. Alternatively, computer 110 may decide that such an input is a normal tap. If stylus 204 remains within the threshold distance during the input, then at decision 2503, computer 110 determines whether the applied pressure exceeds a predetermined threshold pressure within a predetermined time threshold. If not, then a normal tap is detected and the appropriate function is executed at step 2505. If the threshold pressure was reached within the time threshold, then a hard tap is detected and the appropriate function is executed at step 2504. - While the invention has been described with respect to specific examples including presently preferred modes of carrying out the invention, those skilled in the art will appreciate that there are numerous variations and permutations of the above described devices and techniques that fall within the spirit and scope of the invention as set forth in the appended claims. A claim element should not be interpreted as being in means-plus-function format unless the phrase “means for”, “step for”, or “steps for” is included in that element. Also, numerically-labeled steps in method claims are for labeling purposes only and should not be interpreted as requiring a particular ordering of steps.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/619,373 US20100060606A1 (en) | 2004-12-21 | 2009-11-16 | Hard tap |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/017,073 US7629966B2 (en) | 2004-12-21 | 2004-12-21 | Hard tap |
US12/619,373 US20100060606A1 (en) | 2004-12-21 | 2009-11-16 | Hard tap |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/017,073 Continuation US7629966B2 (en) | 2004-12-21 | 2004-12-21 | Hard tap |
Publications (1)
Publication Number | Publication Date |
---|---|
US20100060606A1 true US20100060606A1 (en) | 2010-03-11 |
Family
ID=36595056
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/017,073 Expired - Fee Related US7629966B2 (en) | 2004-12-21 | 2004-12-21 | Hard tap |
US12/619,373 Abandoned US20100060606A1 (en) | 2004-12-21 | 2009-11-16 | Hard tap |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/017,073 Expired - Fee Related US7629966B2 (en) | 2004-12-21 | 2004-12-21 | Hard tap |
Country Status (1)
Country | Link |
---|---|
US (2) | US7629966B2 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130044062A1 (en) * | 2011-08-16 | 2013-02-21 | Nokia Corporation | Method and apparatus for translating between force inputs and temporal inputs |
US9442572B2 (en) | 2013-03-15 | 2016-09-13 | Peter James Tooch | 5-key data entry system and accompanying interface |
Families Citing this family (193)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI269214B (en) * | 2005-06-08 | 2006-12-21 | Elan Microelectronics Corp | Object-detecting method of capacitive touch panel |
DE102006029506B4 (en) * | 2005-10-28 | 2018-10-11 | Volkswagen Ag | input device |
US20080082928A1 (en) * | 2006-09-29 | 2008-04-03 | Sbc Knowledge Ventures, L.P. | Method for viewing information in a communication device |
US8665225B2 (en) * | 2007-01-07 | 2014-03-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for interpreting a finger gesture |
US20100127983A1 (en) * | 2007-04-26 | 2010-05-27 | Pourang Irani | Pressure Augmented Mouse |
KR100930563B1 (en) * | 2007-11-06 | 2009-12-09 | 엘지전자 주식회사 | Mobile terminal and method of switching broadcast channel or broadcast channel list of mobile terminal |
US8634645B2 (en) * | 2008-03-28 | 2014-01-21 | Smart Technologies Ulc | Method and tool for recognizing a hand-drawn table |
US8600164B2 (en) * | 2008-03-28 | 2013-12-03 | Smart Technologies Ulc | Method and tool for recognizing a hand-drawn table |
US8497836B2 (en) | 2008-05-06 | 2013-07-30 | Cisco Technology, Inc. | Identifying user by measuring pressure of button presses on user input device |
US8296670B2 (en) * | 2008-05-19 | 2012-10-23 | Microsoft Corporation | Accessing a menu utilizing a drag-operation |
US8331685B2 (en) | 2008-05-28 | 2012-12-11 | Apple Inc. | Defining a border for an image |
US8154524B2 (en) * | 2008-06-24 | 2012-04-10 | Microsoft Corporation | Physics simulation-based interaction for surface computing |
KR101495559B1 (en) * | 2008-07-21 | 2015-02-27 | 삼성전자주식회사 | User command input method and apparatus |
JP4636141B2 (en) * | 2008-08-28 | 2011-02-23 | ソニー株式会社 | Information processing apparatus and method, and program |
TW201013475A (en) * | 2008-09-16 | 2010-04-01 | Ideacom Technology Corp | Cursor control apparatus and the method therein |
JP5083150B2 (en) * | 2008-09-30 | 2012-11-28 | カシオ計算機株式会社 | Image processing apparatus, processing order setting method thereof, and processing order setting program |
KR101586627B1 (en) * | 2008-10-06 | 2016-01-19 | 삼성전자주식회사 | A method for controlling of list with multi touch and apparatus thereof |
KR101569176B1 (en) * | 2008-10-30 | 2015-11-20 | 삼성전자주식회사 | Method and Apparatus for executing an object |
US20100123686A1 (en) * | 2008-11-19 | 2010-05-20 | Sony Ericsson Mobile Communications Ab | Piezoresistive force sensor integrated in a display |
US8423916B2 (en) * | 2008-11-20 | 2013-04-16 | Canon Kabushiki Kaisha | Information processing apparatus, processing method thereof, and computer-readable storage medium |
US8030914B2 (en) * | 2008-12-29 | 2011-10-04 | Motorola Mobility, Inc. | Portable electronic device having self-calibrating proximity sensors |
US8275412B2 (en) * | 2008-12-31 | 2012-09-25 | Motorola Mobility Llc | Portable electronic device having directional proximity sensors based on device orientation |
JP5470861B2 (en) * | 2009-01-09 | 2014-04-16 | ソニー株式会社 | Display device and display method |
KR101544364B1 (en) | 2009-01-23 | 2015-08-17 | 삼성전자주식회사 | Mobile terminal having dual touch screen and method for controlling contents thereof |
US8633901B2 (en) * | 2009-01-30 | 2014-01-21 | Blackberry Limited | Handheld electronic device having a touchscreen and a method of using a touchscreen of a handheld electronic device |
JP5734546B2 (en) * | 2009-02-25 | 2015-06-17 | 京セラ株式会社 | Object display device |
US8510665B2 (en) * | 2009-03-16 | 2013-08-13 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
US20100271312A1 (en) * | 2009-04-22 | 2010-10-28 | Rachid Alameh | Menu Configuration System and Method for Display on an Electronic Device |
US20100271331A1 (en) * | 2009-04-22 | 2010-10-28 | Rachid Alameh | Touch-Screen and Method for an Electronic Device |
US8391719B2 (en) * | 2009-05-22 | 2013-03-05 | Motorola Mobility Llc | Method and system for conducting communication between mobile devices |
US8788676B2 (en) * | 2009-05-22 | 2014-07-22 | Motorola Mobility Llc | Method and system for controlling data transmission to or from a mobile device |
US8619029B2 (en) * | 2009-05-22 | 2013-12-31 | Motorola Mobility Llc | Electronic device with sensing assembly and method for interpreting consecutive gestures |
US8304733B2 (en) * | 2009-05-22 | 2012-11-06 | Motorola Mobility Llc | Sensing assembly for mobile device |
US8542186B2 (en) * | 2009-05-22 | 2013-09-24 | Motorola Mobility Llc | Mobile device with user interaction capability and method of operating same |
US8344325B2 (en) * | 2009-05-22 | 2013-01-01 | Motorola Mobility Llc | Electronic device with sensing assembly and method for detecting basic gestures |
US8294105B2 (en) * | 2009-05-22 | 2012-10-23 | Motorola Mobility Llc | Electronic device with sensing assembly and method for interpreting offset gestures |
US8269175B2 (en) * | 2009-05-22 | 2012-09-18 | Motorola Mobility Llc | Electronic device with sensing assembly and method for detecting gestures of geometric shapes |
US8319170B2 (en) | 2009-07-10 | 2012-11-27 | Motorola Mobility Llc | Method for adapting a pulse power mode of a proximity sensor |
JP5310389B2 (en) * | 2009-08-27 | 2013-10-09 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
JP2011048669A (en) * | 2009-08-27 | 2011-03-10 | Kyocera Corp | Input device |
JP5304544B2 (en) * | 2009-08-28 | 2013-10-02 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
JP5267388B2 (en) * | 2009-08-31 | 2013-08-21 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
JP5310403B2 (en) * | 2009-09-02 | 2013-10-09 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
JP5182260B2 (en) * | 2009-09-02 | 2013-04-17 | ソニー株式会社 | Operation control device, operation control method, and computer program |
WO2011058735A1 (en) | 2009-11-12 | 2011-05-19 | 京セラ株式会社 | Portable terminal, input control program and input control method |
JP5325747B2 (en) * | 2009-11-12 | 2013-10-23 | 京セラ株式会社 | Portable terminal and input control program |
US8665227B2 (en) | 2009-11-19 | 2014-03-04 | Motorola Mobility Llc | Method and apparatus for replicating physical key function with soft keys in an electronic device |
TW201128478A (en) * | 2010-02-12 | 2011-08-16 | Novatek Microelectronics Corp | Touch sensing method and system using the same |
US9007190B2 (en) | 2010-03-31 | 2015-04-14 | Tk Holdings Inc. | Steering wheel sensors |
US8587422B2 (en) | 2010-03-31 | 2013-11-19 | Tk Holdings, Inc. | Occupant sensing system |
DE102011006649B4 (en) | 2010-04-02 | 2018-05-03 | Tk Holdings Inc. | Steering wheel with hand sensors |
US20110248931A1 (en) * | 2010-04-08 | 2011-10-13 | Research In Motion Limited | Tactile feedback for touch-sensitive display |
US8963845B2 (en) | 2010-05-05 | 2015-02-24 | Google Technology Holdings LLC | Mobile device with temperature sensing capability and method of operating same |
US8466889B2 (en) | 2010-05-14 | 2013-06-18 | Research In Motion Limited | Method of providing tactile feedback and electronic device |
EP2386935B1 (en) * | 2010-05-14 | 2015-02-11 | BlackBerry Limited | Method of providing tactile feedback and electronic device |
US8751056B2 (en) | 2010-05-25 | 2014-06-10 | Motorola Mobility Llc | User computer device with temperature sensing capabilities and method of operating same |
US9103732B2 (en) | 2010-05-25 | 2015-08-11 | Google Technology Holdings LLC | User computer device with temperature sensing capabilities and method of operating same |
US9542091B2 (en) | 2010-06-04 | 2017-01-10 | Apple Inc. | Device, method, and graphical user interface for navigating through a user interface using a dynamic object selection indicator |
US9329767B1 (en) * | 2010-06-08 | 2016-05-03 | Google Inc. | User-specific customization based on characteristics of user-interaction |
JP5573487B2 (en) * | 2010-08-20 | 2014-08-20 | ソニー株式会社 | Information processing apparatus, program, and operation control method |
JP5732783B2 (en) * | 2010-09-02 | 2015-06-10 | ソニー株式会社 | Information processing apparatus, input control method for information processing apparatus, and program |
US20140368455A1 (en) * | 2011-03-15 | 2014-12-18 | Logitech Europe Sa | Control method for a function of a touchpad |
JP5722696B2 (en) | 2011-05-10 | 2015-05-27 | 京セラ株式会社 | Electronic device, control method, and control program |
US9244605B2 (en) | 2011-05-31 | 2016-01-26 | Apple Inc. | Devices, methods, and graphical user interfaces for document manipulation |
US8587542B2 (en) * | 2011-06-01 | 2013-11-19 | Motorola Mobility Llc | Using pressure differences with a touch-sensitive display screen |
US9417754B2 (en) | 2011-08-05 | 2016-08-16 | P4tents1, LLC | User interface system, method, and computer program product |
US20140204063A1 (en) * | 2011-09-05 | 2014-07-24 | Nec Casio Mobile Communications, Ltd. | Portable Terminal Apparatus, Portable Terminal Control Method, And Program |
US8976128B2 (en) | 2011-09-12 | 2015-03-10 | Google Technology Holdings LLC | Using pressure differences with a touch-sensitive display screen |
US9069460B2 (en) | 2011-09-12 | 2015-06-30 | Google Technology Holdings LLC | Using pressure differences with a touch-sensitive display screen |
US9002322B2 (en) | 2011-09-29 | 2015-04-07 | Apple Inc. | Authentication with secondary approver |
US8769624B2 (en) | 2011-09-29 | 2014-07-01 | Apple Inc. | Access control utilizing indirect authentication |
US9063591B2 (en) | 2011-11-30 | 2015-06-23 | Google Technology Holdings LLC | Active styluses for interacting with a mobile device |
US8963885B2 (en) | 2011-11-30 | 2015-02-24 | Google Technology Holdings LLC | Mobile device for interacting with an active stylus |
US9411423B2 (en) * | 2012-02-08 | 2016-08-09 | Immersion Corporation | Method and apparatus for haptic flex gesturing |
EP2629186A1 (en) * | 2012-02-15 | 2013-08-21 | Siemens Aktiengesellschaft | Hand-held control device for controlling an industrial device and method for altering a parameter |
US9870066B2 (en) | 2012-03-02 | 2018-01-16 | Microsoft Technology Licensing, Llc | Method of manufacturing an input device |
US9460029B2 (en) | 2012-03-02 | 2016-10-04 | Microsoft Technology Licensing, Llc | Pressure sensitive keys |
US9075566B2 (en) | 2012-03-02 | 2015-07-07 | Microsoft Technology Licensing, Llc | Flexible hinge spine |
US9298236B2 (en) | 2012-03-02 | 2016-03-29 | Microsoft Technology Licensing, Llc | Multi-stage power adapter configured to provide a first power level upon initial connection of the power adapter to the host device and a second power level thereafter upon notification from the host device to the power adapter |
US9064654B2 (en) | 2012-03-02 | 2015-06-23 | Microsoft Technology Licensing, Llc | Method of manufacturing an input device |
US9706089B2 (en) | 2012-03-02 | 2017-07-11 | Microsoft Technology Licensing, Llc | Shifted lens camera for mobile computing devices |
US20130234959A1 (en) * | 2012-03-06 | 2013-09-12 | Industry-University Cooperation Foundation Hanyang University | System and method for linking and controlling terminals |
WO2013154720A1 (en) | 2012-04-13 | 2013-10-17 | Tk Holdings Inc. | Pressure sensor including a pressure sensitive material for use with control systems and methods of using the same |
WO2013169875A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for displaying content associated with a corresponding affordance |
WO2013169865A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input |
AU2013259637B2 (en) | 2012-05-09 | 2016-07-07 | Apple Inc. | Device, method, and graphical user interface for selecting user interface objects |
JP6273263B2 (en) | 2012-05-09 | 2018-01-31 | アップル インコーポレイテッド | Device, method, and graphical user interface for displaying additional information in response to user contact |
CN104471521B (en) | 2012-05-09 | 2018-10-23 | 苹果公司 | For providing the equipment, method and graphic user interface of feedback for the state of activation for changing user interface object |
JP6002836B2 (en) | 2012-05-09 | 2016-10-05 | アップル インコーポレイテッド | Device, method, and graphical user interface for transitioning between display states in response to a gesture |
WO2013169849A2 (en) | 2012-05-09 | 2013-11-14 | Industries Llc Yknots | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
WO2013169843A1 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for manipulating framed graphical objects |
WO2013169842A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for selecting object within a group of objects |
EP2847661A2 (en) * | 2012-05-09 | 2015-03-18 | Apple Inc. | Device, method, and graphical user interface for moving and dropping a user interface object |
JP6082458B2 (en) * | 2012-05-09 | 2017-02-15 | アップル インコーポレイテッド | Device, method, and graphical user interface for providing tactile feedback of actions performed within a user interface |
WO2013169851A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for facilitating user interaction with controls in a user interface |
WO2013169845A1 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for scrolling nested regions |
US20130300590A1 (en) | 2012-05-14 | 2013-11-14 | Paul Henry Dietz | Audio Feedback |
US10031556B2 (en) | 2012-06-08 | 2018-07-24 | Microsoft Technology Licensing, Llc | User experience adaptation |
WO2014018121A1 (en) | 2012-07-26 | 2014-01-30 | Changello Enterprise Llc | Fingerprint-assisted force estimation |
WO2014018116A1 (en) | 2012-07-26 | 2014-01-30 | Changello Enterprise Llc | Ultrasound-based force sensing and touch sensing |
WO2014018115A1 (en) | 2012-07-26 | 2014-01-30 | Changello Enterprise Llc | Ultrasound-based force sensing of inputs |
WO2014035479A2 (en) | 2012-08-30 | 2014-03-06 | Changello Enterprise Llc | Auto-baseline determination for force sensing |
WO2014043664A1 (en) | 2012-09-17 | 2014-03-20 | Tk Holdings Inc. | Single layer force sensor |
US10275137B2 (en) * | 2012-11-05 | 2019-04-30 | Trane International | Method of displaying incrementing or decrementing number to simulate fast acceleration |
WO2014105279A1 (en) | 2012-12-29 | 2014-07-03 | Yknots Industries Llc | Device, method, and graphical user interface for switching between user interfaces |
AU2013368445B8 (en) | 2012-12-29 | 2017-02-09 | Apple Inc. | Device, method, and graphical user interface for determining whether to scroll or select contents |
CN109375853A (en) | 2012-12-29 | 2019-02-22 | 苹果公司 | To equipment, method and the graphic user interface of the navigation of user interface hierarchical structure |
AU2013368443B2 (en) | 2012-12-29 | 2016-03-24 | Apple Inc. | Device, method, and graphical user interface for transitioning between touch input to display output relationships |
WO2014105277A2 (en) | 2012-12-29 | 2014-07-03 | Yknots Industries Llc | Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics |
KR20170081744A (en) | 2012-12-29 | 2017-07-12 | 애플 인크. | Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture |
US20140201657A1 (en) * | 2013-01-15 | 2014-07-17 | Motorola Mobility Llc | Method and apparatus for receiving input of varying levels of complexity to perform actions having different sensitivities |
CN103092483B (en) * | 2013-02-05 | 2019-05-24 | 华为终端有限公司 | Touch operation method and mobile terminal in user interface |
WO2014143776A2 (en) | 2013-03-15 | 2014-09-18 | Bodhi Technology Ventures Llc | Providing remote interactions with host device using a wireless device |
KR20140114766A (en) * | 2013-03-19 | 2014-09-29 | 퀵소 코 | Method and device for sensing touch inputs |
US9612689B2 (en) | 2015-02-02 | 2017-04-04 | Qeexo, Co. | Method and apparatus for classifying a touch event on a touchscreen as related to one of multiple function generating interaction layers and activating a function in the selected interaction layer |
US9013452B2 (en) | 2013-03-25 | 2015-04-21 | Qeexo, Co. | Method and system for activating different interactive functions using different types of finger contacts |
US9304549B2 (en) | 2013-03-28 | 2016-04-05 | Microsoft Technology Licensing, Llc | Hinge mechanism for rotatable component attachment |
JP5939581B2 (en) | 2013-07-08 | 2016-06-22 | インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation | Method for moving object to be moved displayed on display screen, electronic device and electronic device program thereof |
US20150022460A1 (en) * | 2013-07-17 | 2015-01-22 | Lenovo (Singapore) Pte. Ltd. | Input character capture on touch surface using cholesteric display |
US20150052430A1 (en) * | 2013-08-13 | 2015-02-19 | Dropbox, Inc. | Gestures for selecting a subset of content items |
US9665206B1 (en) | 2013-09-18 | 2017-05-30 | Apple Inc. | Dynamic user interface adaptable to multiple input tools |
US9690389B2 (en) * | 2013-09-19 | 2017-06-27 | Dell Products L.P. | Force sensing keyboard with automatic adjustment of actuation force base on user typing style |
FR3015381B1 (en) * | 2013-12-19 | 2016-01-29 | Dav | CONTROL DEVICE FOR MOTOR VEHICLE AND CONTROL METHOD |
TWI499944B (en) * | 2014-01-17 | 2015-09-11 | Egalax Empia Technology Inc | Active stylus with switching function |
US9582145B2 (en) | 2014-01-27 | 2017-02-28 | Groupon, Inc. | Learning user interface |
US10089346B2 (en) | 2014-04-25 | 2018-10-02 | Dropbox, Inc. | Techniques for collapsing views of content items in a graphical user interface |
US9891794B2 (en) | 2014-04-25 | 2018-02-13 | Dropbox, Inc. | Browsing and selecting content items based on user gestures |
US11343335B2 (en) | 2014-05-29 | 2022-05-24 | Apple Inc. | Message processing by subscriber app prior to message forwarding |
US9967401B2 (en) | 2014-05-30 | 2018-05-08 | Apple Inc. | User interface for phone call routing among devices |
US9898162B2 (en) | 2014-05-30 | 2018-02-20 | Apple Inc. | Swiping functions for messaging applications |
EP3149554B1 (en) | 2014-05-30 | 2024-05-01 | Apple Inc. | Continuity |
US9971500B2 (en) | 2014-06-01 | 2018-05-15 | Apple Inc. | Displaying options, assigning notification, ignoring messages, and simultaneous user interface displays in a messaging application |
EP3147747A1 (en) | 2014-06-27 | 2017-03-29 | Apple Inc. | Manipulation of calendar application in device with touch screen |
WO2016014601A2 (en) | 2014-07-21 | 2016-01-28 | Apple Inc. | Remote user interface |
WO2016022205A1 (en) | 2014-08-02 | 2016-02-11 | Apple Inc. | Context-specific user interfaces |
US10339293B2 (en) | 2014-08-15 | 2019-07-02 | Apple Inc. | Authenticated device used to unlock another device |
EP3189406B1 (en) | 2014-09-02 | 2022-09-07 | Apple Inc. | Phone user interface |
US10055121B2 (en) | 2015-03-07 | 2018-08-21 | Apple Inc. | Activity based thresholds and feedbacks |
US9990107B2 (en) | 2015-03-08 | 2018-06-05 | Apple Inc. | Devices, methods, and graphical user interfaces for displaying and using menus |
US10048757B2 (en) | 2015-03-08 | 2018-08-14 | Apple Inc. | Devices and methods for controlling media presentation |
US10567477B2 (en) | 2015-03-08 | 2020-02-18 | Apple Inc. | Virtual assistant continuity |
US10095396B2 (en) | 2015-03-08 | 2018-10-09 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US9645732B2 (en) | 2015-03-08 | 2017-05-09 | Apple Inc. | Devices, methods, and graphical user interfaces for displaying and using menus |
US9632664B2 (en) | 2015-03-08 | 2017-04-25 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US9639184B2 (en) | 2015-03-19 | 2017-05-02 | Apple Inc. | Touch input cursor manipulation |
US9785305B2 (en) | 2015-03-19 | 2017-10-10 | Apple Inc. | Touch input cursor manipulation |
US20170045981A1 (en) | 2015-08-10 | 2017-02-16 | Apple Inc. | Devices and Methods for Processing Touch Inputs Based on Their Intensities |
US10067653B2 (en) | 2015-04-01 | 2018-09-04 | Apple Inc. | Devices and methods for processing touch inputs based on their intensities |
US9860451B2 (en) | 2015-06-07 | 2018-01-02 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10346030B2 (en) | 2015-06-07 | 2019-07-09 | Apple Inc. | Devices and methods for navigating between user interfaces |
US9891811B2 (en) | 2015-06-07 | 2018-02-13 | Apple Inc. | Devices and methods for navigating between user interfaces |
US10200598B2 (en) | 2015-06-07 | 2019-02-05 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US9830048B2 (en) | 2015-06-07 | 2017-11-28 | Apple Inc. | Devices and methods for processing touch inputs with instructions in a web page |
US9674426B2 (en) | 2015-06-07 | 2017-06-06 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10157410B2 (en) | 2015-07-14 | 2018-12-18 | Ebay Inc. | Enhanced shopping actions on a mobile device |
US10248308B2 (en) | 2015-08-10 | 2019-04-02 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures |
US9880735B2 (en) | 2015-08-10 | 2018-01-30 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US20170046058A1 (en) * | 2015-08-10 | 2017-02-16 | Apple Inc. | Devices, Methods, and Graphical User Interfaces for Adjusting User Interface Objects |
US10235035B2 (en) | 2015-08-10 | 2019-03-19 | Apple Inc. | Devices, methods, and graphical user interfaces for content navigation and manipulation |
US10416800B2 (en) * | 2015-08-10 | 2019-09-17 | Apple Inc. | Devices, methods, and graphical user interfaces for adjusting user interface objects |
KR102426695B1 (en) * | 2015-10-20 | 2022-07-29 | 삼성전자주식회사 | Screen outputting method and electronic device supporting the same |
CN105242807B (en) * | 2015-10-20 | 2018-03-27 | 广东欧珀移动通信有限公司 | A kind of message inspection method and terminal |
JP6711632B2 (en) * | 2016-02-08 | 2020-06-17 | キヤノン株式会社 | Information processing apparatus, information processing method, and program |
US10514768B2 (en) * | 2016-03-15 | 2019-12-24 | Fisher-Rosemount Systems, Inc. | Gestures and touch in operator interface |
DK179186B1 (en) | 2016-05-19 | 2018-01-15 | Apple Inc | REMOTE AUTHORIZATION TO CONTINUE WITH AN ACTION |
US10739972B2 (en) | 2016-06-10 | 2020-08-11 | Apple Inc. | Device, method, and graphical user interface for managing electronic communications |
AU2017100667A4 (en) | 2016-06-11 | 2017-07-06 | Apple Inc. | Activity and workout updates |
DK201670622A1 (en) | 2016-06-12 | 2018-02-12 | Apple Inc | User interfaces for transactions |
DK179411B1 (en) | 2016-09-06 | 2018-06-06 | Apple Inc | Devices and methods for processing and rendering touch inputs unambiguous using intensity thresholds based on a prior input intensity |
CN109661644B (en) * | 2016-09-23 | 2022-07-29 | 华为技术有限公司 | Pressure touch method and terminal |
US9870098B1 (en) | 2016-09-27 | 2018-01-16 | International Business Machines Corporation | Pressure-sensitive touch screen display and method |
US9715307B1 (en) | 2016-10-31 | 2017-07-25 | International Business Machines Corporation | Pressure-sensitive touch screen display and method |
US9958979B1 (en) | 2016-10-31 | 2018-05-01 | International Business Machines Corporation | Web server that renders a web page based on a client pressure profile |
US10733372B2 (en) * | 2017-01-10 | 2020-08-04 | Microsoft Technology Licensing, Llc | Dynamic content generation |
US10678422B2 (en) * | 2017-03-13 | 2020-06-09 | International Business Machines Corporation | Automatic generation of a client pressure profile for a touch screen device |
US10992795B2 (en) | 2017-05-16 | 2021-04-27 | Apple Inc. | Methods and interfaces for home media control |
US11431836B2 (en) | 2017-05-02 | 2022-08-30 | Apple Inc. | Methods and interfaces for initiating media playback |
CN111343060B (en) | 2017-05-16 | 2022-02-11 | 苹果公司 | Method and interface for home media control |
US20220279063A1 (en) | 2017-05-16 | 2022-09-01 | Apple Inc. | Methods and interfaces for home media control |
EP3422169A1 (en) * | 2017-06-29 | 2019-01-02 | Vestel Elektronik Sanayi ve Ticaret A.S. | Apparatus and method for operating an electronic device |
KR102363707B1 (en) * | 2017-08-03 | 2022-02-17 | 삼성전자주식회사 | An electronic apparatus comprising a force sensor and a method for controlling electronic apparatus thereof |
US10891033B2 (en) * | 2018-08-24 | 2021-01-12 | Microsoft Technology Licensing, Llc | System and method for enhanced touch selection of content |
US11579750B2 (en) * | 2018-12-14 | 2023-02-14 | Perksy, Inc. | Methods, systems, and apparatus, for receiving persistent responses to online surveys |
USD941829S1 (en) | 2018-12-31 | 2022-01-25 | Perksy, Inc. | Display screen with graphical user interface |
US10996917B2 (en) | 2019-05-31 | 2021-05-04 | Apple Inc. | User interfaces for audio media control |
JP7075547B2 (en) | 2019-05-31 | 2022-05-25 | アップル インコーポレイテッド | User interface for audio media control |
US11477609B2 (en) | 2019-06-01 | 2022-10-18 | Apple Inc. | User interfaces for location-related communications |
US11481094B2 (en) | 2019-06-01 | 2022-10-25 | Apple Inc. | User interfaces for location-related communications |
US11379113B2 (en) | 2019-06-01 | 2022-07-05 | Apple Inc. | Techniques for selecting text |
JP7235689B2 (en) * | 2020-02-26 | 2023-03-08 | Kddi株式会社 | Haptic sensation presentation method, system and program |
US11392291B2 (en) | 2020-09-25 | 2022-07-19 | Apple Inc. | Methods and interfaces for media control with dynamic feedback |
US11847378B2 (en) | 2021-06-06 | 2023-12-19 | Apple Inc. | User interfaces for audio routing |
JP2024051770A (en) * | 2022-09-30 | 2024-04-11 | 株式会社ワコム | Electronic pen, input system, and writing pressure adjustment method |
Citations (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4918262A (en) * | 1989-03-14 | 1990-04-17 | Ibm Corporation | Touch sensing display screen signal processing apparatus and method |
US5184120A (en) * | 1991-04-04 | 1993-02-02 | Motorola, Inc. | Menu selection using adaptive force sensing resistor |
US5708460A (en) * | 1995-06-02 | 1998-01-13 | Avi Systems, Inc. | Touch screen |
US5936614A (en) * | 1991-04-30 | 1999-08-10 | International Business Machines Corporation | User defined keyboard entry system |
US5995106A (en) * | 1993-05-24 | 1999-11-30 | Sun Microsystems, Inc. | Graphical user interface for displaying and navigating in a directed graph structure |
US6104317A (en) * | 1998-02-27 | 2000-08-15 | Motorola, Inc. | Data entry device and method |
US20010024195A1 (en) * | 2000-03-21 | 2001-09-27 | Keisuke Hayakawa | Page information display method and device and storage medium storing program for displaying page information |
US20020084977A1 (en) * | 2000-10-27 | 2002-07-04 | Tadashi Nakamura | Electronic equipment and pointer display method |
US20020140680A1 (en) * | 2001-03-30 | 2002-10-03 | Koninklijke Philips Electronics N.V. | Handheld electronic device with touch pad |
US20020158851A1 (en) * | 2001-04-27 | 2002-10-31 | Masaki Mukai | Input device and inputting method with input device |
US20020180763A1 (en) * | 2001-06-05 | 2002-12-05 | Shao-Tsu Kung | Touch screen using pressure to control the zoom ratio |
US20020191029A1 (en) * | 2001-05-16 | 2002-12-19 | Synaptics, Inc. | Touch screen with user interface enhancement |
US20030043174A1 (en) * | 2001-08-29 | 2003-03-06 | Hinckley Kenneth P. | Automatic scrolling |
US6590568B1 (en) * | 2000-11-20 | 2003-07-08 | Nokia Corporation | Touch screen drag and drop input technique |
US20030193484A1 (en) * | 1999-01-07 | 2003-10-16 | Lui Charlton E. | System and method for automatically switching between writing and text input modes |
US6636203B1 (en) * | 2001-05-17 | 2003-10-21 | Palm, Inc. | Keyboard equivalent pad overlay encasement for a handheld electronic device |
US20050057531A1 (en) * | 2003-09-17 | 2005-03-17 | Joseph Patino | Method and system for generating characters |
US20050110769A1 (en) * | 2003-11-26 | 2005-05-26 | Dacosta Henry | Systems and methods for adaptive interpretation of input from a touch-sensitive input device |
US20060001654A1 (en) * | 2004-06-30 | 2006-01-05 | National Semiconductor Corporation | Apparatus and method for performing data entry with light based touch screen displays |
US20060019648A1 (en) * | 2004-07-06 | 2006-01-26 | Logitech Europe S.A. | Communication zone on a keyboard |
US7681889B2 (en) * | 2004-07-21 | 2010-03-23 | Eagle Industry Co., Ltd. | Seal Device |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0820924B2 (en) * | 1986-04-30 | 1996-03-04 | 株式会社東芝 | Handwriting display device |
US4914624A (en) * | 1988-05-06 | 1990-04-03 | Dunthorn David I | Virtual button for touch screen |
US5475401A (en) * | 1993-04-29 | 1995-12-12 | International Business Machines, Inc. | Architecture and method for communication of writing and erasing signals from a remote stylus to a digitizing display |
US5574840A (en) * | 1994-08-29 | 1996-11-12 | Microsoft Corporation | Method and system for selecting text utilizing a plurality of text using switchable minimum granularity of selection |
DE69625756T2 (en) * | 1995-08-11 | 2003-11-20 | Sharp Kk | Document processing device |
US5930813A (en) * | 1995-12-21 | 1999-07-27 | Adobe Systems Incorporated | Method and system for designating objects |
US6954899B1 (en) * | 1997-04-14 | 2005-10-11 | Novint Technologies, Inc. | Human-computer interface including haptically controlled interactions |
JPH11203044A (en) * | 1998-01-16 | 1999-07-30 | Sony Corp | Information processing system |
JPH11355617A (en) * | 1998-06-05 | 1999-12-24 | Fuji Photo Film Co Ltd | Camera with image display device |
US7032171B1 (en) * | 1998-12-31 | 2006-04-18 | International Business Machines Corporation | System and method for selecting and processing information in an electronic document |
US6147683A (en) * | 1999-02-26 | 2000-11-14 | International Business Machines Corporation | Graphical selection marker and method for lists that are larger than a display window |
US20070018970A1 (en) * | 2000-12-22 | 2007-01-25 | Logitech Europe S.A. | Optical slider for input devices |
US6803929B2 (en) * | 2001-07-05 | 2004-10-12 | International Business Machines Corporation | Method, apparatus and computer program product for moving or copying information |
US7685538B2 (en) * | 2003-01-31 | 2010-03-23 | Wacom Co., Ltd. | Method of triggering functions in a computer application using a digitizer having a stylus and a digitizer system |
JP3809424B2 (en) | 2003-03-17 | 2006-08-16 | 株式会社クレオ | Selection area control device, selection area control method, and selection area control program |
- 2004
  - 2004-12-21 US US11/017,073 patent/US7629966B2/en not_active Expired - Fee Related
- 2009
  - 2009-11-16 US US12/619,373 patent/US20100060606A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US20060132456A1 (en) | 2006-06-22 |
US7629966B2 (en) | 2009-12-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9116612B2 (en) | Pressure sensitive controls | |
US7683889B2 (en) | Pressure based selection | |
US7629966B2 (en) | Hard tap | |
US6791536B2 (en) | Simulating gestures of a pointing device using a stylus and providing feedback thereto | |
US7486282B2 (en) | Size variant pressure eraser | |
JP3546337B2 (en) | User interface device for computing system and method of using graphic keyboard | |
CN205427823U (en) | Electronic device and device for performing text selection operation | |
CN102224488B (en) | Interpreting gesture input including introduction or removal of a point of contact while a gesture is in progress | |
CN104020850B (en) | The gesture operation carried out using multipoint sensing device | |
TWI433029B (en) | Systems, methods, and computer-readable media for invoking an electronic ink or handwriting interface | |
JP2006164238A (en) | Touchpad input information processing method and touchpad input information processing apparatus | |
JP2012048623A (en) | Information processing unit, parameter setting method, and program | |
CN111708478A (en) | Disambiguation of keyboard input | |
JP2010517197A (en) | Gestures with multipoint sensing devices | |
KR20110036005A (en) | Virtual touchpad | |
US12277308B2 (en) | Interactions between an input device and an electronic device | |
US20060262105A1 (en) | Pen-centric polyline drawing tool | |
CN106104450B (en) | How to select a part of the GUI | |
EP3610361B1 (en) | Multi-stroke smart ink gesture language | |
WO2009119716A1 (en) | Information processing system, information processing device, method, and program | |
Tu et al. | Text Pin: Improving text selection with mode-augmented handles on touchscreen mobile devices | |
KR100764765B1 (en) | How to Implement Help Output Button in Information and Communication Terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034564/0001 Effective date: 20141014 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| AS | Assignment | Owner name: MICROSOFT CORPORATION, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ANSON, DAVID LEININGER ADOLPHSON;REEL/FRAME:051905/0268 Effective date: 20041220 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |