US20160154566A1 - Adaptive running mode - Google Patents
- Publication number
- US20160154566A1 (application US14/354,433)
- Authority
- US
- United States
- Prior art keywords
- option
- contact
- display
- location
- mobile device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72469—User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72475—User interfaces specially adapted for cordless or mobile telephones specially adapted for disabled users
- H04M1/72481—User interfaces specially adapted for cordless or mobile telephones specially adapted for disabled users for visually impaired users
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72448—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
Definitions
- a user of a mobile device may wish to use the device (e.g., interact with the mobile device user interface) when running or exercising.
- the user may interact with the user interface in order to play a song, change a song, check email, etc.
- the user may find it difficult to interact with the small user interface of a mobile device. For example, when the user is attempting to touch or select a first option on the user interface, the user may actually end up touching or selecting a second option, which may be near the first option, on the user interface. Therefore, there is a need to enable a user to control or interact with a mobile device when the user is in motion.
- Embodiments of the invention are directed to systems, methods and computer program products for enabling a user to control a mobile device when the user is in motion.
- An exemplary method comprises: determining a location of contact on a mobile device display, wherein the display presents at least one option; determining a first option located near the location of contact; magnifying the first option on the display such that the first option encloses the location of contact; and initiating execution of a function associated with the first option.
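The four claimed steps (locate the contact, find a nearby option, magnify it to enclose the contact, execute its function) can be illustrated with a minimal sketch. This is a hypothetical model, not the patented implementation: the `Option` rectangle, `nearest_option`, and `magnify_to_enclose` names are invented for illustration, and options are modeled as axis-aligned rectangles with distance measured to their centers.

```python
import math
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Option:
    """A selectable option rendered as an axis-aligned rectangle."""
    name: str
    x: float  # left edge
    y: float  # top edge
    w: float  # width
    h: float  # height

def nearest_option(options, cx, cy):
    """Return the option whose center is closest to the contact point (cx, cy)."""
    return min(options, key=lambda o: math.dist((o.x + o.w / 2, o.y + o.h / 2), (cx, cy)))

def magnify_to_enclose(option, cx, cy, margin=4.0):
    """Grow the option's rectangle just enough to enclose the contact point."""
    left = min(option.x, cx - margin)
    top = min(option.y, cy - margin)
    right = max(option.x + option.w, cx + margin)
    bottom = max(option.y + option.h, cy + margin)
    return replace(option, x=left, y=top, w=right - left, h=bottom - top)
```

The magnified rectangle encloses both the original option and the contact location, matching the enclosure requirement in the claim language.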
- the first option is at least one of the most logical option or the option nearest to the location of contact.
- magnifying the first option comprises increasing the dimensions of the first option.
- the magnified first option is highlighted or is presented in a different color from the first option.
- the magnified first option encloses the first option.
- an outline of the first option is visible inside the magnified first option.
- the contact is made using either a finger or an object.
- the mobile device comprises at least one of a mobile phone, a watch, a music player, a camera, or a tablet computing device.
- the contact is maintained for a predetermined period.
- determining a first option located near the location of contact comprises determining the first option is selectable.
- the location of contact is within an area of the first option.
- the location of contact is determined based on where the contact is released from the display, and the location of contact is not determined based on where the contact is initially detected on the display.
- the location of contact is determined based on where the contact is initially detected on the display, and the location of contact is not determined based on where the contact is released from the display.
- the contact is determined based on a camera or a sensor associated with the display.
- the method further comprises providing tactile or audio feedback to a user of the mobile device.
- an intermediary logic layer is provided between an existing application that is executed on the mobile device and sensor or camera logic input for determining the location of contact.
- the method further comprises enabling an adaptive running mode when the mobile device determines that the mobile device or the user is in motion.
- the contact comprises actual contact or virtual contact.
- virtual contact occurs when a user's finger or object hovers above the display.
- an apparatus for enabling a user to control a mobile device when the user is in motion.
- the apparatus comprises a display configured to present at least one option; a memory; a processor; and a module stored in the memory, executable by the processor, and configured to: determine a location of contact on the display; determine a first option located near the location of contact; magnify the first option on the display such that the first option encloses the location of contact; and initiate execution of a function associated with the first option.
- a computer program product enabling a user to control a mobile device when the user is in motion.
- the computer program product comprises a non-transitory computer-readable medium comprising a set of codes for causing a computer to: determine a location of contact on the display; determine a first option located near the location of contact; magnify the first option on the display such that the first option encloses the location of contact; and initiate execution of a function associated with the first option.
- another method, apparatus, and computer program product comprises identifying an initial contact on a mobile device display; determining that the initial contact is dragged across the display while maintaining contact on the display; determining that the initial contact is released from the display; determining a location associated with the initial contact first contacting the display or associated with the initial contact being released from the display; determining an option located near the location; and initiating execution of a function associated with the option.
- An apparatus and computer program product may be provided to execute this method.
- the method further comprises presenting a graphical lasso from the location to the determined option.
- the initial contact is associated with a first location on the display
- the release of the contact is associated with a second location on the display
- the method further comprises highlighting or magnifying a first option located near the first location when the initial contact is detected.
- when the release of the contact is detected, the method further comprises highlighting or magnifying a second option located near the second location, and restoring the first option to its original magnification or highlighting.
- the method further comprises progressively restoring the first option to its original magnification or highlighting, and progressively increasing the magnification or highlighting of the second option.
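The press-drag-release selection described above can be sketched as follows. The names and the distance-to-center model are assumptions for illustration; whether the press point or the release point governs selection is expressed as a policy flag, mirroring the alternative embodiments.

```python
import math

def resolve_drag(options, press, release, select_on_release=True):
    """Resolve a press-drag-release gesture to one option name.

    `options` maps option names to (x, y) centers. The default policy
    selects based on where contact is released from the display; setting
    the flag to False selects based on where contact first landed instead.
    """
    point = release if select_on_release else press
    return min(options, key=lambda name: math.dist(options[name], point))
```

With the same gesture, the two policies can resolve to different options, which is why the embodiments treat them as distinct.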
- another method, apparatus, and computer program product comprises determining a location of actual or predicted contact on a mobile device display, wherein the display presents at least one option; determining information presented in an area enclosing the location; highlighting or magnifying the information; and presenting the highlighted or magnified information to the user.
- An apparatus and computer program product may be provided to execute this method.
- FIG. 1 is an exemplary process flow for enabling a user to control a mobile device when the user is in motion, in accordance with embodiments of the present invention
- FIG. 2 is an exemplary user interface for enabling a user to control a mobile device when the user is in motion, in accordance with embodiments of the present invention
- FIG. 3 is an exemplary mobile device, in accordance with embodiments of the present invention.
- FIG. 4 is a diagram illustrating a rear view of exemplary external components of the mobile device depicted in FIG. 3 , in accordance with embodiments of the present invention.
- FIG. 5 is a diagram illustrating exemplary internal components of the mobile device depicted in FIG. 3 , in accordance with embodiments of the present invention.
- Embodiments of the invention are directed to systems, methods and computer program products for enabling a user to control a mobile device when the user is in motion (e.g., when the user is running, exercising, etc.).
- the present invention does not compensate for shifts in a mobile device display interface based on accelerometer input. Instead, the present invention directly interacts with a user via a dynamic display interface as described below.
- a user may establish initial contact (e.g., a user's finger or other object) with a display, drag the contact location on the display, and then release contact from the display.
- an option on the mobile device display is selected upon detecting the release of the contact from the display and not upon detecting initial contact on the display.
- the selected option may not be the option associated with the initial contact on the display.
- the selected option may be the option associated with the final contact on the display before releasing the contact from the display.
- the selected option may be the option associated with the initial contact on the display, and may not be the option associated with the final contact on the display.
- the selected option may be an option located on the path between the initial contact and the final contact locations.
- the mobile device determines that the option has been selected and initiates a function associated with the option. If the contact location is not determined to be located on top of any option on the display, the mobile device determines the option located nearest to the contact location. In some embodiments, if the contact location is not determined to be located on top of any option on the display (and is near one or more options), the mobile device determines the “best logical” option rather than the “nearest” contact location. For example, the “best logical” option may be based on the user's prior contact point (or the previous determined option) or may be based on the user's contact history over a predetermined period.
- for example, if a video is currently being played on the mobile device display, the mobile device will determine that the "best logical" option is the "pause" option, and not the "play" option. Therefore, in some embodiments, the best logical option is not the option nearest to the contact location.
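One way the "best logical" selection could combine distance, playback state, and contact history is sketched below. The function name, the 1.5x distance band, and the pause-over-play rule are illustrative assumptions, not the claimed algorithm.

```python
import math

def best_logical_option(options, contact, history, playing=False):
    """Choose an option for a contact point that missed every option.

    `options` maps names to (x, y) centers; `history` is a list of
    previously selected option names. If media is playing, "pause" beats
    a nearest "play"; otherwise a historically favored option wins when
    it is nearly as close as the nearest one.
    """
    ranked = sorted(options, key=lambda n: math.dist(options[n], contact))
    nearest = ranked[0]
    if playing and nearest == "play" and "pause" in options:
        return "pause"
    near_dist = math.dist(options[nearest], contact)
    for name in ranked[1:]:
        close_enough = math.dist(options[name], contact) <= near_dist * 1.5
        if close_enough and history.count(name) > history.count(nearest):
            return name
    return nearest
```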
- the mobile device subsequently magnifies the determined option so that the option area encloses the contact location and encloses the original option.
- the determined option may not be graphically magnified. Instead, the determined option may be highlighted (e.g., change in color, change in font, etc.).
- a graphical “lasso” or “rubber band” may be displayed from the contact point to the determined option. The “lasso” encloses the contact point and the determined option.
- the determined option may be stretched in a “lasso” or “rubber band” fashion from its position on the display to the contact point. This provides visual feedback to the user indicating the option selected by the user.
- magnification may additionally or alternatively refer to highlighting (with or without an increase in dimensions) or stretching an option on the display.
- the option is not magnified if the user's contact location falls on top of an option. In such embodiments, the option is magnified if the user's contact location does not fall on top of any option on the display. In alternate embodiments, the option is magnified regardless of whether or not the user's contact location falls on top of an option.
- the contact has to be maintained for a period equal to or greater than a predetermined period in order for the mobile device to magnify the option. The period may be computed based on one or more of a period of initial contact, a period of final contact prior to release, and a period of drag between the initial contact and the final contact.
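The predetermined period above is described as composed from up to three sub-periods. A minimal sketch of that check, with an assumed threshold value:

```python
def should_magnify(initial_ms, drag_ms, final_ms, threshold_ms=300):
    """Magnify only if total contact time meets a configurable threshold.

    The total is composed from the initial-touch dwell, the drag duration,
    and the dwell before release, matching one way of computing the period.
    The 300 ms default is an assumption for illustration.
    """
    return initial_ms + drag_ms + final_ms >= threshold_ms
```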
- an option may comprise a selectable option (e.g., information which when selected by the user links the user to more information).
- a display may include an integrated camera and/or a sensor.
- the camera and/or sensor may be located under, above, or on substantially the same surface level as the display. This functionality enables the mobile device to see (using the camera) or sense (using the sensor) where the user's finger or other object touches the display or hovers over the display (e.g., in the air) within a predetermined distance from the surface of the display.
- Information received from the camera and/or sensor may be used to determine the location (e.g., x, y, and z coordinates) of the finger or object with respect to the display (or a point on the display). This location may be referred to as the control location.
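Given (x, y, z) coordinates from the camera and/or sensor, classifying the reading into actual contact, virtual (hover) contact, or no contact could look like the sketch below. The function name and hover range are assumptions.

```python
def control_location(x, y, z, hover_range=20.0):
    """Classify a fingertip reading and report the control location.

    z is the height above the display surface: 0 (or below, for pressure
    sensing) means actual contact; within `hover_range` means virtual
    contact (hovering); beyond that, no control location is reported.
    """
    if z <= 0:
        return ("actual", (x, y))
    if z <= hover_range:
        return ("virtual", (x, y))
    return ("none", None)
```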
- the area surrounding the control location if the finger is touching the display, or under the control location, if the finger is hovering above the display, may be highlighted so that the user can receive visual feedback of the location of contact either before contact is made with the display (predicted or virtual contact) or during contact (actual contact) made with the display.
- the highlighted area may be presented in a different color compared to the rest of the display.
- the mobile device may provide tactile feedback at the location of actual or virtual contact.
- the mobile device may provide other feedback signals (e.g., an audio signal) upon detection of contact, detection of contact release, or at any point in time between the detection of contact and the detection of contact release.
- a magnification window may be presented on or above the control location on the display and the information in the control location (and/or the information located above, below, or on either side of the control location) may be presented in the magnification window.
- the magnification window may be presented in conjunction with a magnetic snapping mechanism.
- when the user moves the user's finger or other object on or above the display, the control location, and consequently the magnification window, also moves.
- the magnification window may be highlighted or presented in a different color and may overlap any information located under the magnification window. Additionally, tactile feedback may be provided to the user as the user moves the control location.
- a magnification window may also be referred to as a magnifying glass. The present invention is not limited to presenting the magnification window for magnifying a cursor position associated with text input. Instead, the magnification window may be provided for any pre-existing mobile device applications.
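The magnification window's content selection (the information at the control location plus the information around it) can be sketched by modeling the display as a character grid. The grid model and window radius are illustrative assumptions.

```python
def magnification_window(grid, row, col, radius=1):
    """Return the cells in a square window around the control location.

    `grid` is a list of equal-length strings modeling display contents;
    cells that would fall outside the display are clipped.
    """
    rows = range(max(0, row - radius), min(len(grid), row + radius + 1))
    return [grid[r][max(0, col - radius): col + radius + 1] for r in rows]
```

As the control location moves, calling this with the new (row, col) moves the window with it.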
- the various features of the invention may be executed by the mobile device when the mobile device is in an adaptive running mode.
- This mode is enabled when the mobile device determines that the user is in motion.
- the mobile device may determine that the user is in motion using a sensor (e.g., a gyroscope) that detects shaking of the mobile device (e.g., shaking with a speed greater than or equal to a predetermined speed).
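A shake check over motion-sensor samples might look like the following. Here accelerometer magnitudes stand in for the gyroscope/shake-speed test described above; the threshold and sample format are assumptions.

```python
import math

def is_in_motion(samples, threshold=12.0):
    """Decide whether the device is shaking hard enough to enable the mode.

    `samples` is a sequence of (ax, ay, az) readings in m/s^2; the mode
    engages when the peak magnitude exceeds a tunable threshold (the 12.0
    default is an illustrative guess, a bit above gravity at rest).
    """
    peak = max(math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in samples)
    return peak >= threshold
```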
- a user may manually enable the adaptive running mode.
- the adaptive running mode may be triggered upon detection of a “long-press” event (e.g., when a user maintains contact with the display for equal to or greater than a predetermined period). Additionally, the adaptive running mode may be disengaged when the “long-press” event ends.
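The long-press trigger policy (engage after a sustained press, disengage when the press ends) can be modeled as a small state machine. The class name and 500 ms default are assumptions.

```python
class AdaptiveRunningMode:
    """Engage on a long press; disengage when the press ends."""

    def __init__(self, long_press_ms=500):
        self.long_press_ms = long_press_ms
        self.enabled = False
        self._down_at = None

    def touch_down(self, t_ms):
        """Record when contact with the display began."""
        self._down_at = t_ms

    def touch_held(self, t_ms):
        """Engage the mode once contact has been held long enough."""
        if self._down_at is not None and t_ms - self._down_at >= self.long_press_ms:
            self.enabled = True

    def touch_up(self, t_ms):
        """Disengage the mode when the long-press event ends."""
        self.enabled = False
        self._down_at = None
```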
- FIG. 1 presents a process flow 100 for enabling a user to control a mobile device when the user is in motion.
- the various process blocks presented in FIG. 1 may be executed in an order that is different from that presented in FIG. 1 .
- the process flow comprises determining a location of contact on a mobile device display, wherein the display presents at least one option.
- the process flow comprises determining a first option located near the location of contact.
- the process flow comprises magnifying the first option on the display such that the first option encloses the location of contact.
- the process flow comprises initiating execution of a function associated with the first option.
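The four process blocks above can be chained end to end as a sketch, with the magnification step modeled as returned state rather than rendered graphics. All names and the callable-per-option wiring are illustrative assumptions.

```python
import math

def handle_contact(contact, options, actions):
    """End-to-end sketch of process flow 100.

    Locates the contact, picks the nearest option (`options` maps names
    to (x, y) centers), records the magnification step as state, then
    initiates the associated function from `actions`.
    """
    name = min(options, key=lambda n: math.dist(options[n], contact))
    magnified = {"option": name, "encloses": contact}  # stand-in for the visual step
    return magnified, actions[name]()
```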
- contact on the display may refer to a tap (e.g., single or multiple tap) or a push (e.g., single push or multiple pushes) on the display.
- FIG. 2 presents an exemplary interface 210 associated with a mobile device display.
- the interface 210 comprises several selectable options. As indicated in interfaces 220 and 230 , the user attempts to select options 222 and 232 . However, the user's contact locations are 221 and 231 . The mobile device determines that the options located nearest to contact locations 221 and 231 are options 222 and 232 . Consequently, the mobile device magnifies these options. The magnified options 224 and 234 are also presented in FIG. 2 . These magnified options provide the user with visual feedback of the user's selection. Additionally, the magnified options may snap (e.g., magnetically snap) to the user's contact locations so that the user receives tactile feedback of the user's selection.
- option 222 is presented as a magnified option 224 .
- the magnified option 224 may be reduced to its original size.
- the user may move the contact point from contact point 221 to contact point 231 while maintaining contact with the display. During this movement, when the contact point is determined to be closer to option 232 rather than option 222 , the magnified option 224 is reduced to its original size, while the option 232 is presented as a magnified option 234 .
- the magnified option 224 progressively decreases to its original size 222 .
- the option 232 is progressively magnified 234 .
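The progressive hand-off between the two options can be sketched as a cross-fade of scale factors as the drag progresses. Linear interpolation and the 1.6x maximum scale are assumptions; the patent does not specify an easing curve.

```python
def progressive_scales(t, max_scale=1.6):
    """Scale factors for the old and new options at drag progress t in [0, 1].

    The previously magnified option eases back toward 1.0 while the newly
    targeted option grows toward `max_scale`, so the magnification shifts
    smoothly from option 222 to option 232 during the drag.
    """
    t = max(0.0, min(1.0, t))
    old_scale = max_scale - (max_scale - 1.0) * t
    new_scale = 1.0 + (max_scale - 1.0) * t
    return old_scale, new_scale
```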
- FIG. 2 indicates an option being magnified, in other embodiments, the option may be highlighted (with or without changing the dimensions of the option) or stretched.
- FIG. 3 is a diagram illustrating a front view of external components of an exemplary mobile device.
- the mobile device illustrated in FIG. 3 is a mobile communication device (e.g., portable mobile communication device such as a mobile phone).
- the mobile device may be any other computing device such as a tablet computing device, a laptop computer, a watch, a music player, or the like, wherein the mobile device may or may not provide communication capability.
- the mobile device may perform any of the computing functions described herein.
- Housing 305 may include a structure configured to contain or at least partially contain components of mobile device 112 .
- housing 305 may be formed from plastic, metal or other natural or synthetic materials or combination(s) of materials and may be configured to support microphone 310 , speaker 320 , display 350 , and camera button 360 .
- Microphone 310 may include any component capable of transducing air pressure waves to a corresponding electrical signal. For example, a user may speak into microphone 310 during a telephone call.
- Speaker 320 may include any component capable of transducing an electrical signal to a corresponding sound wave. For example, a user may listen to music through speaker 320 .
- the display 350 may function as a touchpad or touchscreen.
- Touchpad may include any component capable of providing input to device 112 .
- Touchpad may include a standard telephone keypad or a QWERTY keypad.
- Touchpad may also include one or more special purpose keys.
- a user may utilize touchpad for entering information, such as text or a phone number, or activating a special function, such as placing a telephone call, playing various media, capturing a photo, setting various camera features (e.g., focus, zoom, etc.) or accessing an application.
- Display 350 may include any component capable of providing visual information.
- display 350 may be a liquid crystal display (LCD).
- display 350 may be any one of other display technologies, such as a plasma display panel (PDP), a field emission display (FED), a thin film transistor (TFT) display, etc.
- Display 350 may be utilized to display, for example, text, image, and/or video information.
- Display 350 may also operate as a view finder, as will be described later.
- a camera button 360 may also be provided that enables a user to take an image. However, in alternate embodiments, the camera button 360 may not be provided.
- Although mobile device 112 illustrated in FIG. 3 is exemplary in nature, mobile device 112 is intended to be broadly interpreted to include any type of electronic device that includes an image-capturing component.
- mobile device 112 may include a mobile phone, a personal digital assistant (PDA), a portable computer, a camera, or a watch.
- mobile device 112 may include, for example, security devices or military devices.
- Although FIG. 3 illustrates exemplary external components of mobile device 112, mobile device 112 may contain fewer, different, or additional external components than the external components depicted in FIG. 3 .
- one or more external components of mobile device 112 may include the capabilities of one or more other external components of mobile device 112 .
- display 350 may be an input component (e.g., a touchscreen such as a capacitive touchscreen).
- the touchscreen may function as a keypad or a touchpad.
- the external components may be arranged differently than the external components depicted in FIG. 3 .
- FIG. 4 is a diagram illustrating a rear view of external components of the exemplary mobile device.
- mobile device 112 may include a camera 470 , a lens assembly 472 , a proximity sensor 476 , and a flash 474 .
- Camera 470 may include any component capable of capturing an image. Camera 470 may be a digital camera. Display 350 may operate as a view finder when a user of mobile device 112 operates camera 470 . Camera 470 may provide for adjustment of a camera setting. In one implementation, mobile device 112 may include camera software that is displayable on display 350 to allow a user to adjust a camera setting.
- Lens assembly 472 may include any component capable of manipulating light so that an image may be captured.
- Lens assembly 472 may include a number of optical lens elements.
- the optical lens elements may be of different shapes (e.g., convex, biconvex, plano-convex, concave, etc.) and different distances of separation.
- An optical lens element may be made from glass, plastic (e.g., acrylic), or plexiglass.
- the optical lens may be multicoated (e.g., an antireflection coating or an ultraviolet (UV) coating) to minimize unwanted effects, such as lens flare and inaccurate color.
- lens assembly 472 may be permanently fixed to camera 470 .
- lens assembly 472 may be interchangeable with other lenses having different optical characteristics.
- Lens assembly 472 may provide for a variable aperture size (e.g., adjustable f-number).
- Proximity sensor 476 may include any component capable of collecting and providing distance information that may be used to enable camera 470 to capture an image properly.
- proximity sensor 476 may include a sensor that allows camera 470 to compute the distance to an object.
- proximity sensor 476 may include an acoustic proximity sensor.
- the acoustic proximity sensor may include a timing circuit to measure echo return of ultrasonic soundwaves.
- the proximity sensor may be used to determine a distance to one or more moving objects, which may or may not be in focus, either prior to, during, or after capturing of an image frame of a scene.
- proximity of an object to the mobile device may be calculated during a post-processing step (e.g., after capturing the image).
- the proximity sensor 476 may determine that a finger or object is located close to the display, and information provided by the proximity sensor 476 may be used to determine a control location on the display under the finger or object, wherein the finger or object is not touching the display.
- Flash 474 may include any type of light-emitting component to provide illumination when camera 470 captures an image.
- flash 474 may be a light-emitting diode (LED) flash (e.g., white LED) or a xenon flash.
- flash 474 may include a flash module.
- mobile device 112 may include fewer, additional, and/or different components than the exemplary external components depicted in FIG. 4 .
- camera 470 may be a film camera.
- flash 474 may be a portable flashgun.
- mobile device 112 may be a single-lens reflex camera.
- one or more external components of mobile device 112 may be arranged differently.
- FIG. 5 is a diagram illustrating internal components of the exemplary mobile device.
- mobile device 112 may include microphone 310 , speaker 320 , display 350 , camera 470 , a memory 500 , a transceiver 520 , and a control unit 530 .
- the control unit 530 may enable a user to switch between a touchpad mode and a display mode 540 .
- In touchpad mode, the display 350 functions as at least one of an input device (e.g., a numeric keypad or a QWERTY touchpad) or an output device.
- In display mode, the display 350 functions as an output device.
- the control unit 530 enables triggering an adaptive running mode (IARM) 550 as described herein.
- the camera 470 and the sensor 560 may be used to perform various processes associated with the IARM mode as described herein.
- the mobile device 112 may also include a near-field communication (NFC) chip.
- the chip may be an active or passive chip that enables data to be transmitted from the mobile device 112 to a receiving terminal (or received at the mobile device 112 from a sending terminal).
- An active chip is activated using a power source located in the mobile device 112 .
- a passive chip is activated using an electromagnetic field of the receiving terminal.
- Memory 500 may include any type of storing component to store data and instructions related to the operation and use of mobile device 112 .
- memory 500 may include a memory component, such as a random access memory (RAM), a read only memory (ROM), and/or a programmable read only memory (PROM).
- memory 500 may include a storage component, such as a magnetic storage component (e.g., a hard drive) or other type of computer-readable or computer-executable medium.
- Memory 500 may also include an external storing component, such as a Universal Serial Bus (USB) memory stick, a digital camera memory card, and/or a Subscriber Identity Module (SIM) card.
- Memory 500 may include a code component 510 that includes computer-readable or computer-executable instructions to perform one or more functions. These functions include initiating and/or executing the processes described herein.
- the code component 510 may work in conjunction with one or more other hardware or software components associated with the mobile device 112 to initiate and/or execute the processes described herein. Additionally, code component 510 may include computer-readable or computer-executable instructions to provide other functionality other than as described herein.
- Transceiver 520 may include any component capable of transmitting and receiving information wirelessly or via a wired connection.
- transceiver 520 may include a radio circuit that provides wireless communication with a network or another device.
- Control unit 530 may include any logic that may interpret and execute instructions, and may control the overall operation of mobile device 112 .
- Logic as used herein, may include hardware, software, and/or a combination of hardware and software.
- Control unit 530 may include, for example, a general-purpose processor, a microprocessor, a data processor, a co-processor, and/or a network processor.
- Control unit 530 may access instructions from memory 500 , from other components of mobile device 112 , and/or from a source external to mobile device 112 (e.g., a network or another device).
- Control unit 530 may provide for different operational modes associated with mobile device 112 . Additionally, control unit 530 may operate in multiple modes simultaneously. For example, control unit 530 may operate in a camera mode, a music player mode, and/or a telephone mode. For example, when in camera mode, face-detection and tracking logic may enable mobile device 112 to detect and track multiple objects (e.g., the presence and position of each object's face) within an image to be captured.
- mobile device 112 may include fewer, additional, and/or different components than the exemplary internal components depicted in FIG. 5 .
- mobile device 112 may not include transceiver 520 .
- one or more internal components of mobile device 112 may include the capabilities of one or more other components of mobile device 112 .
- transceiver 520 and/or control unit 530 may include their own on-board memory.
- the present invention may include and/or be embodied as an apparatus (including, for example, a system, machine, device, computer program product, and/or the like), as a method (including, for example, a business method, computer-implemented process, and/or the like), or as any combination of the foregoing.
- embodiments of the present invention may take the form of an entirely business method embodiment, an entirely software embodiment (including firmware, resident software, micro-code, stored procedures in a database, etc.), an entirely hardware embodiment, or an embodiment combining business method, software, and hardware aspects that may generally be referred to herein as a “system.”
- embodiments of the present invention may take the form of a computer program product that includes a computer-readable storage medium having one or more computer-executable program code portions stored therein.
- a processor which may include one or more processors, may be “configured to” perform a certain function in a variety of ways, including, for example, by having one or more general-purpose circuits perform the function by executing one or more computer-executable program code portions embodied in a computer-readable medium, and/or by having one or more application-specific circuits perform the function.
- the computer-readable medium may include, but is not limited to, a non-transitory computer-readable medium, such as a tangible electronic, magnetic, optical, electromagnetic, infrared, and/or semiconductor system, device, and/or other apparatus.
- the non-transitory computer-readable medium includes a tangible medium such as a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a compact disc read-only memory (CD-ROM), and/or some other tangible optical and/or magnetic storage device.
- the computer-readable medium may be transitory, such as, for example, a propagation signal including computer-executable program code portions embodied therein.
- One or more computer-executable program code portions for carrying out operations of the present invention may include object-oriented, scripted, and/or unscripted programming languages, such as, for example, Java, Perl, Smalltalk, C++, SAS, SQL, Python, Objective C, JavaScript, and/or the like.
- the one or more computer-executable program code portions for carrying out operations of embodiments of the present invention are written in conventional procedural programming languages, such as the “C” programming language and/or similar programming languages.
- the computer program code may alternatively or additionally be written in one or more multi-paradigm programming languages, such as, for example, F#.
- These one or more computer-executable program code portions may be provided to a processor of a general purpose computer, special purpose computer, and/or some other programmable data processing apparatus in order to produce a particular machine, such that the one or more computer-executable program code portions, which execute via the processor of the computer and/or other programmable data processing apparatus, create mechanisms for implementing the steps and/or functions represented by the flowchart(s) and/or block diagram block(s).
- the one or more computer-executable program code portions may be stored in a transitory and/or non-transitory computer-readable medium (e.g., a memory, etc.) that can direct, instruct, and/or cause a computer and/or other programmable data processing apparatus to function in a particular manner, such that the computer-executable program code portions stored in the computer-readable medium produce an article of manufacture including instruction mechanisms which implement the steps and/or functions specified in the flowchart(s) and/or block diagram block(s).
- the one or more computer-executable program code portions may also be loaded onto a computer and/or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer and/or other programmable apparatus.
- this produces a computer-implemented process such that the one or more computer-executable program code portions which execute on the computer and/or other programmable apparatus provide operational steps to implement the steps specified in the flowchart(s) and/or the functions specified in the block diagram block(s).
- computer-implemented steps may be combined with, and/or replaced with, operator- and/or human-implemented steps in order to carry out an embodiment of the present invention.
Abstract
The invention is directed to systems, methods and computer program products for enabling a user to control a mobile device when the user is in motion. The invention could also be used for “visibility assistance,” either for vision-impaired users, or simply to visualize screen objects obscured by the finger or other pointing device. An exemplary method comprises: determining a location of contact on a mobile device display, wherein the display presents at least one option; determining a first option located near the location of contact; magnifying the first option on the display such that the first option encloses the location of contact; and initiating execution of a function associated with the first option.
Description
- A user of a mobile device (e.g., a music player, a mobile phone, a watch, etc.) may wish to use the device (e.g., interact with the mobile device user interface) when running or exercising. For example, the user may interact with the user interface in order to play a song, change a song, check email, etc. However, when the user is in motion, the user may find it difficult to interact with the small user interface of a mobile device. For example, when the user is attempting to touch or select a first option on the user interface, the user may actually end up touching or selecting a second option, which may be near the first option, on the user interface. Therefore, there is a need to enable a user to control or interact with a mobile device when the user is in motion.
- Embodiments of the invention are directed to systems, methods and computer program products for enabling a user to control a mobile device when the user is in motion. An exemplary method comprises: determining a location of contact on a mobile device display, wherein the display presents at least one option; determining a first option located near the location of contact; magnifying the first option on the display such that the first option encloses the location of contact; and initiating execution of a function associated with the first option.
- In some embodiments, the first option is at least one of the most logical option or the option nearest to the location of contact.
- In some embodiments, magnifying the first option comprises increasing the dimensions of the first option.
- In some embodiments, the magnified first option is highlighted or is presented in a different color from the first option.
- In some embodiments, the magnified first option encloses the first option.
- In some embodiments, an outline of the first option is visible inside the magnified first option.
- In some embodiments, the contact is made using either a finger or an object.
- In some embodiments, the mobile device comprises at least one of a mobile phone, a watch, a music player, a camera, or a tablet computing device.
- In some embodiments, the contact is maintained for a predetermined period.
- In some embodiments, determining a first option located near the location of contact comprises determining the first option is selectable.
- In some embodiments, the location of contact is within an area of the first option.
- In some embodiments, the location of contact is determined based on where the contact is released from the display, and the location of contact is not determined based on where the contact is initially detected on the display.
- In some embodiments, the location of contact is determined based on where the contact is initially detected on the display, and the location of contact is not determined based on where the contact is released from the display.
- In some embodiments, the contact is determined based on a camera or a sensor associated with the display.
- In some embodiments, the method further comprises providing tactile or audio feedback to a user of the mobile device.
- In some embodiments, an intermediary logic layer is provided between an existing application that is executed on the mobile device and sensor or camera logic input for determining the location of contact.
- In some embodiments, the method further comprises enabling an adaptive running mode when the mobile device determines that the mobile device or the user is in motion.
- In some embodiments, the contact comprises actual contact or virtual contact.
- In some embodiments, virtual contact occurs when a user's finger or object hovers above the display.
- In some embodiments, an apparatus is provided for enabling a user to control a mobile device when the user is in motion. The apparatus comprises a display configured to present at least one option; a memory; a processor; and a module stored in the memory, executable by the processor, and configured to: determine a location of contact on the display; determine a first option located near the location of contact; magnify the first option on the display such that the first option encloses the location of contact; and initiate execution of a function associated with the first option.
- In some embodiments, a computer program product is provided enabling a user to control a mobile device when the user is in motion. The computer program product comprises a non-transitory computer-readable medium comprising a set of codes for causing a computer to: determine a location of contact on the display; determine a first option located near the location of contact; magnify the first option on the display such that the first option encloses the location of contact; and initiate execution of a function associated with the first option.
- In some embodiments, another method, apparatus, and computer program product are provided. The method comprises identifying an initial contact on a mobile device display; determining that the initial contact is dragged across the display while maintaining contact on the display; determining that the initial contact is released from the display; determining a location associated with the initial contact first contacting the display or associated with the initial contact being released from the display; determining an option located near the location; and initiating execution of a function associated with the option. An apparatus and computer program product may be provided to execute this method.
- In some embodiments, the method further comprises presenting a graphical lasso from the location to the determined option.
- In some embodiments, the initial contact is associated with a first location on the display, and the release of the contact is associated with a second location on the display, and the method further comprises highlighting or magnifying a first option located near the first location when the initial contact is detected.
- In some embodiments, when the release of the contact is detected, the method further comprises highlighting or magnifying a second option located near the second location, and restoring the first option to its original magnification or highlighting.
- In some embodiments, while the initial contact is being dragged across the display, the method further comprises progressively restoring the first option to its original magnification or highlighting, and progressively increasing the magnification or highlighting of the second option.
- In some embodiments, another method, apparatus, and computer program product are provided. The method comprises determining a location of actual or predicted contact on a mobile device display, wherein the display presents at least one option; determining information presented in an area enclosing the location; highlighting or magnifying the information; and presenting the highlighted or magnified information to the user. An apparatus and computer program product may be provided to execute this method.
- Having thus described embodiments of the invention in general terms, reference will now be made to the accompanying drawings, where:
- FIG. 1 is an exemplary process flow for enabling a user to control a mobile device when the user is in motion, in accordance with embodiments of the present invention;
- FIG. 2 is an exemplary user interface for enabling a user to control a mobile device when the user is in motion, in accordance with embodiments of the present invention;
- FIG. 3 is an exemplary mobile device, in accordance with embodiments of the present invention;
- FIG. 4 is a diagram illustrating a rear view of exemplary external components of the mobile device depicted in FIG. 3, in accordance with embodiments of the present invention; and
- FIG. 5 is a diagram illustrating exemplary internal components of the mobile device depicted in FIG. 3, in accordance with embodiments of the present invention.
- Embodiments of the present invention now may be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all, embodiments of the invention are shown. Indeed, the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure may satisfy applicable legal requirements. Like numbers refer to like elements throughout.
- Embodiments of the invention are directed to systems, methods and computer program products for enabling a user to control a mobile device when the user is in motion (e.g., when the user is running, exercising, etc.). The present invention does not compensate for shifts in a mobile device display interface based on accelerometer input. Instead, the present invention directly interacts with a user via a dynamic display interface as described below.
- A user may establish initial contact with a display (e.g., using a finger or other object), drag the contact location on the display, and then release contact from the display. In some embodiments, an option on the mobile device display is selected upon detecting the release of the contact from the display and not upon detecting initial contact on the display. In such embodiments, when the user drags the user's finger or other object on the display, the selected option may not be the option associated with the initial contact on the display. Instead, the selected option may be the option associated with the final contact on the display before releasing the contact from the display. In alternate embodiments, the selected option may be the option associated with the initial contact on the display, and may not be the option associated with the final contact on the display. In still other embodiments, the selected option may be an option located on the path between the initial contact and the final contact locations.
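The alternative selection rules above (select where the contact is released, or where it first touched the display) can be sketched as a small policy function. This is an illustrative sketch only; the function and policy names are assumptions, not the patent's actual implementation.

```python
# Hypothetical sketch: which contact location selects an option.
def selection_location(initial, release, policy="release"):
    """Return the (x, y) location used for option selection.

    policy:
      "release" -- select based on where contact leaves the display
      "initial" -- select based on where contact first touched the display
    """
    if policy == "release":
        return release
    if policy == "initial":
        return initial
    raise ValueError("unknown policy: %s" % policy)
```

For example, a drag that starts at (0, 0) and ends at (50, 80) selects the option near (50, 80) under the release policy, and the option near (0, 0) under the initial-contact policy.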
- If the contact location is determined to be located on top of an option, the mobile device determines that the option has been selected and initiates a function associated with the option. If the contact location is not determined to be located on top of any option on the display, the mobile device determines the option located nearest to the contact location. In some embodiments, if the contact location is not determined to be located on top of any option on the display (and is near one or more options), the mobile device determines the “best logical” option rather than the option nearest to the contact location. For example, the “best logical” option may be based on the user's prior contact point (or the previously determined option) or may be based on the user's contact history over a predetermined period. For example, if the user is watching a video on the mobile device display, and the user's contact point is between the “play” and “pause” options (but closer to the “play” option), the mobile device will determine that the “best logical” option is the “pause” option, and not the “play” option, since a video is currently being played on the mobile device display. Therefore, in some embodiments, the best logical option is not the option nearest to the contact location.
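The “nearest” versus “best logical” distinction above can be sketched as two small functions. The play/pause heuristic mirrors the example in the text, but all names and the specific context rule are illustrative assumptions, not the patent's algorithm.

```python
import math

def nearest_option(contact, options):
    """options maps option name -> (x, y) center; return the nearest name."""
    return min(options, key=lambda name: math.dist(contact, options[name]))

def best_logical_option(contact, options, video_playing=False):
    """Prefer context over pure distance: while a video is playing,
    "pause" wins even if "play" is geometrically closer (assumed rule)."""
    nearest = nearest_option(contact, options)
    if video_playing and nearest == "play" and "pause" in options:
        return "pause"
    return nearest
```

With "play" at (100, 200) and "pause" at (140, 200), a contact at (110, 200) is nearest to "play", but the best logical option while a video plays is "pause".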
- The mobile device subsequently magnifies the determined option so that the option area encloses the contact location and encloses the original option. Alternatively, the determined option may not be graphically magnified. Instead, the determined option may be highlighted (e.g., change in color, change in font, etc.). Alternatively or additionally, a graphical “lasso” or “rubber band” may be displayed from the contact point to the determined option. The “lasso” encloses the contact point and the determined option. Still alternatively, the determined option may be stretched in a “lasso” or “rubber band” fashion from its position on the display to the contact point. This provides visual feedback to the user indicating the option selected by the user. Additionally, the color of the magnified, highlighted, or stretched area may be different from the color of the original option area. However, an outline of the original option area may still be visible inside the magnified, highlighted, or stretched area. Additionally, the mobile device may provide tactile feedback to the user such that the user feels that the option magnetically snaps to the location of the user's contact. As used herein, magnification may additionally or alternatively refer to highlighting (with or without an increase in dimensions) or stretching an option on the display.
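The magnification step above, where the option area grows to enclose both the contact location and the original option, reduces to simple rectangle geometry. This is a minimal sketch; the rectangle representation and margin value are assumptions for illustration.

```python
def magnify_to_enclose(option_rect, contact, margin=4):
    """option_rect is (x0, y0, x1, y1); contact is (x, y).
    Return a rectangle enclosing both, expanded by a small margin."""
    x0, y0, x1, y1 = option_rect
    cx, cy = contact
    return (min(x0, cx) - margin, min(y0, cy) - margin,
            max(x1, cx) + margin, max(y1, cy) + margin)
```

The returned rectangle always contains the original option outline (which may still be drawn inside it, as described above) as well as the contact point.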
- In some embodiments, the option is not magnified if the user's contact location falls on top of an option. In such embodiments, the option is magnified if the user's contact location does not fall on top of any option on the display. In alternate embodiments, the option is magnified regardless of whether or not the user's contact location falls on top of an option. In some embodiments, the contact has to be maintained for a period equal to or greater than a predetermined period in order for the mobile device to magnify the option. The period may be computed based on one or more of a period of initial contact, a period of final contact prior to release, and a period of drag between the initial contact and the final contact. As used herein, an option may comprise a selectable option (e.g., information which when selected by the user links the user to more information).
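The predetermined-period check above can be sketched by timing the contact from touch-down to release. The event representation and the 0.3-second threshold are assumptions for illustration, not values from the patent.

```python
def contact_period(events):
    """events: list of (timestamp_seconds, kind), kind in {"down", "move", "up"}.
    Total contact duration from first touch-down to last release."""
    downs = [t for t, kind in events if kind == "down"]
    ups = [t for t, kind in events if kind == "up"]
    if not downs or not ups:
        return 0.0
    return ups[-1] - downs[0]

def should_magnify(events, min_period=0.3):
    """Magnify only if contact was maintained at least min_period seconds."""
    return contact_period(events) >= min_period
```

A hold of 0.5 s (including a drag) passes the check; a 0.1 s tap does not.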
- Additionally, in some embodiments, a display may include an integrated camera and/or a sensor. The camera and/or sensor may be located under, above, or on substantially the same surface level as the display. This functionality enables the mobile device to see (using the camera) or sense (using the sensor) where the user's finger or other object touches the display or hovers over the display (e.g., in the air) within a predetermined distance from the surface of the display. Information received from the camera and/or sensor may be used to determine the location (e.g., x, y, and z coordinates) of the finger or object with respect to the display (or a point on the display). This location may be referred to as the control location. The area surrounding the control location, if the finger is touching the display, or under the control location, if the finger is hovering above the display, may be highlighted so that the user can receive visual feedback of the location of contact either before contact is made with the display (predicted or virtual contact) or during contact (actual contact) made with the display. The highlighted area may be presented in a different color compared to the rest of the display. Additionally or alternatively, the mobile device may provide tactile feedback at the location of actual or virtual contact. Additionally or alternatively, the mobile device may provide other feedback signals (e.g., an audio signal) upon detection of contact, detection of contact release, or at any point in time between the detection of contact and the detection of contact release.
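The actual-versus-virtual contact distinction above can be sketched by classifying the z coordinate reported by the camera or sensor. The 15 mm hover threshold is a made-up value for illustration only.

```python
def classify_contact(z_mm, hover_threshold_mm=15.0):
    """z_mm is the fingertip height above the display surface.

    Returns "actual" when touching, "virtual" when hovering within the
    predetermined distance, and "none" otherwise."""
    if z_mm <= 0.0:
        return "actual"
    if z_mm <= hover_threshold_mm:
        return "virtual"
    return "none"
```

Both the "actual" and "virtual" cases would trigger the highlighting and feedback described above; "none" would not.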
- Additionally or alternatively, a magnification window may be presented on or above the control location on the display and the information in the control location (and/or the information located above, below, or on either side of the control location) may be presented in the magnification window. As described herein, the magnification window may be presented in conjunction with a magnetic snapping mechanism. When the user moves the user's finger or other object on the display or above the display, the control location, and consequently the magnification window, also moves. The magnification window may be highlighted or presented in a different color and may overlap any information located under the magnification window. Additionally, tactile feedback may be provided to the user as the user moves the control location. As used herein, a magnification window may also be referred to as a magnifying glass. The present invention is not limited to presenting the magnification window for magnifying a cursor position associated with text input. Instead, the magnification window may be provided for any pre-existing mobile device applications.
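Placing the magnification window on or above the control location, while keeping it on-screen, can be sketched as follows. The window size and vertical offset are illustrative assumptions.

```python
def magnification_window(control, display_size, window=(80, 40), offset=30):
    """control is (x, y); display_size is (width, height).
    Return the window rectangle (x0, y0, x1, y1), clamped on-screen."""
    cx, cy = control
    w, h = window
    dw, dh = display_size
    x0 = min(max(cx - w // 2, 0), dw - w)      # center horizontally, clamp
    y0 = min(max(cy - offset - h, 0), dh - h)  # place above the finger, clamp
    return (x0, y0, x0 + w, y0 + h)
```

As the control location moves, recomputing this rectangle each frame makes the window follow the finger, as described above.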
- The various features of the invention described herein may be enabled by using an intermediary logic layer between an application that is being executed on the mobile device and sensor or camera logic input that determines an actual or virtual contact location. This enables the present invention to be utilized with existing mobile applications without modification.
- The various features of the invention may be executed by the mobile device when the mobile device is in an adaptive running mode. This mode is enabled when the mobile device determines that the user is in motion. The mobile device may determine that the user is in motion using a sensor (e.g., a gyroscope) that detects shaking of the mobile device (e.g., shaking with a speed greater than or equal to a predetermined speed). Alternatively, a user may manually enable the adaptive running mode. Still alternatively, the adaptive running mode may be triggered upon detection of a “long-press” event (e.g., when a user maintains contact with the display for equal to or greater than a predetermined period). Additionally, the adaptive running mode may be disengaged when the “long-press” event ends.
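The three triggers above (shake detection, manual enablement, and a long-press) can be sketched as a single predicate. The threshold values are placeholders, not values from the patent.

```python
def adaptive_running_mode_enabled(shake_speed=0.0, manual=False,
                                  press_seconds=0.0,
                                  shake_threshold=2.0, long_press=1.0):
    """Return True if the adaptive running mode should be enabled:
    the device shakes at or above a threshold speed, the user enables
    the mode manually, or a long-press is detected."""
    return (manual
            or shake_speed >= shake_threshold
            or press_seconds >= long_press)
```

Disengaging the mode when the long-press ends would simply re-evaluate this predicate with press_seconds reset to zero.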
- Referring now to
FIG. 1, FIG. 1 presents a process flow 100 for enabling a user to control a mobile device when the user is in motion. The various process blocks presented in FIG. 1 may be executed in an order that is different from that presented in FIG. 1. At block 110, the process flow comprises determining a location of contact on a mobile device display, wherein the display presents at least one option. At block 120, the process flow comprises determining a first option located near the location of contact. At block 130, the process flow comprises magnifying the first option on the display such that the first option encloses the location of contact. At block 140, the process flow comprises initiating execution of a function associated with the first option. As used herein, contact on the display may refer to a tap (e.g., single or multiple tap) or a push (e.g., single push or multiple pushes) on the display. - Referring now to
FIG. 2, FIG. 2 presents an exemplary interface 210 associated with a mobile device display. The interface 210 comprises several selectable options. As indicated in FIG. 2, when the user makes contact at locations 221 and 231, the nearby options 222 and 232 are presented as magnified options 224 and 234. These magnified options provide the user with visual feedback of the user's selection. Additionally, the magnified options may snap (e.g., magnetically snap) to the user's contact locations so that the user receives tactile feedback of the user's selection. - Another interpretation of the interfaces in
FIG. 2 is also possible. When the user's contact location 221 is determined by the mobile device, the mobile device also determines that the nearest or most logical option is option 222. Therefore, option 222 is presented as a magnified option 224. When the contact point is lifted from the display, the magnified option 224 may be reduced to its original size. Alternatively, the user may move the contact point from contact point 221 to contact point 231 while maintaining contact with the display. During this movement, when the contact point is determined to be closer to option 232 rather than option 222, the magnified option 224 is reduced to its original size, while the option 232 is presented as a magnified option 234. Still alternatively, if the contact point, while maintaining contact with the display, moves from contact point 221 towards contact point 231, the magnified option 224 progressively decreases to its original size 222. As the contact point approaches contact point 231, the option 232 is progressively magnified 234. Although FIG. 2 indicates an option being magnified, in other embodiments, the option may be highlighted (with or without changing the dimensions of the option) or stretched. - Referring now to
FIG. 3, FIG. 3 is a diagram illustrating a front view of external components of an exemplary mobile device. The mobile device illustrated in FIG. 3 is a mobile communication device (e.g., a portable mobile communication device such as a mobile phone). In alternate embodiments, the mobile device may be any other computing device such as a tablet computing device, a laptop computer, a watch, a music player, or the like, wherein the mobile device may or may not provide communication capability. The mobile device may perform any of the computing functions described herein.
- Housing 305 may include a structure configured to contain or at least partially contain components of mobile device 112. For example, housing 305 may be formed from plastic, metal, or other natural or synthetic materials or combination(s) of materials and may be configured to support microphone 310, speaker 320, display 350, and camera button 360.
- Microphone 310 may include any component capable of transducing air pressure waves to a corresponding electrical signal. For example, a user may speak into microphone 310 during a telephone call. Speaker 320 may include any component capable of transducing an electrical signal to a corresponding sound wave. For example, a user may listen to music through speaker 320.
- The display 350 may function as a touchpad or touchscreen. Touchpad may include any component capable of providing input to device 112. Touchpad may include a standard telephone keypad or a QWERTY keypad. Touchpad may also include one or more special purpose keys. A user may utilize touchpad for entering information, such as text or a phone number, or activating a special function, such as placing a telephone call, playing various media, capturing a photo, setting various camera features (e.g., focus, zoom, etc.), or accessing an application.
- Display 350 may include any component capable of providing visual information. For example, in one implementation, display 350 may be a liquid crystal display (LCD). In another implementation, display 350 may be any one of other display technologies, such as a plasma display panel (PDP), a field emission display (FED), a thin film transistor (TFT) display, etc. Display 350 may be utilized to display, for example, text, image, and/or video information. Display 350 may also operate as a view finder, as will be described later. A camera button 360 may also be provided that enables a user to take an image. However, in alternate embodiments, the camera button 360 may not be provided.
- Since mobile device 112 illustrated in FIG. 3 is exemplary in nature, mobile device 112 is intended to be broadly interpreted to include any type of electronic device that includes an image-capturing component. For example, mobile device 112 may include a mobile phone, a personal digital assistant (PDA), a portable computer, a camera, or a watch. In other instances, mobile device 112 may include, for example, security devices or military devices. Accordingly, although FIG. 3 illustrates exemplary external components of mobile device 112, in other implementations, mobile device 112 may contain fewer, different, or additional external components than the external components depicted in FIG. 3. Additionally, or alternatively, one or more external components of mobile device 112 may include the capabilities of one or more other external components of mobile device 112. For example, display 350 may be an input component (e.g., a touchscreen such as a capacitive touchscreen). The touchscreen may function as a keypad or a touchpad. Additionally or alternatively, the external components may be arranged differently than the external components depicted in FIG. 3.
- Referring now to
FIG. 4 ,FIG. 4 is a diagram illustrating a rear view of external components of the exemplary mobile device. As illustrated, in addition to the components previously described,mobile device 112 may include acamera 470, alens assembly 472, a proximity sensor 476, and aflash 474. -
Camera 470 may include any component capable of capturing an image.Camera 470 may be a digital camera.Display 350 may operate as a view finder when a user ofmobile device 112 operatescamera 470.Camera 470 may provide for adjustment of a camera setting. In one implementation,mobile device 112 may include camera software that is displayable ondisplay 350 to allow a user to adjust a camera setting. -
Lens assembly 472 may include any component capable of manipulating light so that an image may be captured.Lens assembly 472 may include a number of optical lens elements. The optical lens elements may be of different shapes (e.g., convex, biconvex, plano-convex, concave, etc.) and different distances of separation. An optical lens element may be made from glass, plastic (e.g., acrylic), or plexiglass. The optical lens may be multicoated (e.g., an antireflection coating or an ultraviolet (UV) coating) to minimize unwanted effects, such as lens flare and inaccurate color. In one implementation,lens assembly 472 may be permanently fixed tocamera 470. In other implementations,lens assembly 472 may be interchangeable with other lenses having different optical characteristics.Lens assembly 472 may provide for a variable aperture size (e.g., adjustable f-number). - Proximity sensor 476 (not shown in
FIG. 4 ) may include any component capable of collecting and providing distance information that may be used to enablecamera 470 to capture an image properly. For example, proximity sensor 476 may include a proximity sensor that allowscamera 470 to compute the distance to an object. In another implementation, proximity sensor 476 may include an acoustic proximity sensor. The acoustic proximity sensor may include a timing circuit to measure echo return of ultrasonic soundwaves. In embodiments that include a proximity sensor 476, the proximity sensor may be used to determine a distance to one or more moving objects, which may or may not be in focus, either prior to, during, or after capturing of an image frame of a scene. In some embodiments, proximity of an object to the mobile device may be calculated during a post-processing step (e.g., after capturing the image). In still other embodiments, the proximity sensor 476 may determine that a finger or object is located close to the display, and information provided by the proximity sensor 476 may be used to determine a control location on the display under the finger or object, wherein the finger or object is not touching the display. -
Flash 474 may include any type of light-emitting component to provide illumination when camera 470 captures an image. For example, flash 474 may be a light-emitting diode (LED) flash (e.g., a white LED) or a xenon flash. In another implementation, flash 474 may include a flash module. - Although
FIG. 4 illustrates exemplary external components, in other implementations, mobile device 112 may include fewer, additional, and/or different components than the exemplary external components depicted in FIG. 4. For example, in other implementations, camera 470 may be a film camera. Additionally, or alternatively, depending on mobile device 112, flash 474 may be a portable flashgun. Additionally, or alternatively, mobile device 112 may be a single-lens reflex camera. In still other implementations, one or more external components of mobile device 112 may be arranged differently. - Referring now to
FIG. 5, FIG. 5 is a diagram illustrating internal components of the exemplary mobile device. As illustrated, mobile device 112 may include microphone 310, speaker 320, display 350, camera 470, a memory 500, a transceiver 520, and a control unit 530. Additionally, the control unit 530 may enable a user to switch between a touchpad mode and a display mode 540. In touchpad mode, the display 350 functions as at least one of an input device (e.g., a numeric keypad or a QWERTY touchpad) or an output device. In display mode, the display 350 functions as an output device. Additionally, the control unit 530 enables triggering an adaptive running mode (IARM) 550 as described herein. The camera 470 and the sensor 560 may be used to perform various processes associated with the IARM mode as described herein. - The
mobile device 112 may also include a near-field communication (NFC) chip. The chip may be an active or passive chip that enables data to be transmitted from the mobile device 112 to a receiving terminal (or received at the mobile device 112 from a sending terminal). An active chip is activated using a power source located in the mobile device 112. A passive chip is activated using the electromagnetic field of the receiving terminal. -
Memory 500 may include any type of storage component to store data and instructions related to the operation and use of mobile device 112. For example, memory 500 may include a memory component, such as a random access memory (RAM), a read-only memory (ROM), and/or a programmable read-only memory (PROM). Additionally, memory 500 may include a storage component, such as a magnetic storage component (e.g., a hard drive) or another type of computer-readable or computer-executable medium. Memory 500 may also include an external storage component, such as a Universal Serial Bus (USB) memory stick, a digital camera memory card, and/or a Subscriber Identity Module (SIM) card. -
Memory 500 may include a code component 510 that includes computer-readable or computer-executable instructions to perform one or more functions. These functions include initiating and/or executing the processes described herein. The code component 510 may work in conjunction with one or more other hardware or software components associated with the mobile device 112 to initiate and/or execute the processes described herein. Additionally, code component 510 may include computer-readable or computer-executable instructions to provide functionality other than that described herein. -
Transceiver 520 may include any component capable of transmitting and receiving information wirelessly or via a wired connection. For example, transceiver 520 may include a radio circuit that provides wireless communication with a network or another device. -
Control unit 530 may include any logic that may interpret and execute instructions, and may control the overall operation of mobile device 112. Logic, as used herein, may include hardware, software, and/or a combination of hardware and software. Control unit 530 may include, for example, a general-purpose processor, a microprocessor, a data processor, a co-processor, and/or a network processor. Control unit 530 may access instructions from memory 500, from other components of mobile device 112, and/or from a source external to mobile device 112 (e.g., a network or another device). -
Control unit 530 may provide for different operational modes associated with mobile device 112. Additionally, control unit 530 may operate in multiple modes simultaneously. For example, control unit 530 may operate in a camera mode, a music player mode, and/or a telephone mode. For example, when in camera mode, face-detection and tracking logic may enable mobile device 112 to detect and track multiple objects (e.g., the presence and position of each object's face) within an image to be captured. - Although
FIG. 5 illustrates exemplary internal components, in other implementations, mobile device 112 may include fewer, additional, and/or different components than the exemplary internal components depicted in FIG. 5. For example, in one implementation, mobile device 112 may not include transceiver 520. In still other implementations, one or more internal components of mobile device 112 may include the capabilities of one or more other components of mobile device 112. For example, transceiver 520 and/or control unit 530 may include their own on-board memory. - The various features described with respect to any embodiments described herein are applicable to any of the other embodiments described herein.
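The mode behavior described for control unit 530 (switching between touchpad and display modes, and triggering the adaptive running mode when motion is detected) can be modeled as a small state sketch. This is purely illustrative: the class and attribute names are assumptions, not the patent's implementation.

```python
from enum import Enum

class DisplayMode(Enum):
    TOUCHPAD = "touchpad"  # display 350 acts as both input (e.g., keypad/QWERTY) and output
    DISPLAY = "display"    # display 350 acts as an output device only

class ControlUnit:
    """Illustrative model of the mode switching described for control unit 530."""

    def __init__(self) -> None:
        self.mode = DisplayMode.DISPLAY
        self.adaptive_running_mode = False  # IARM 550, triggered on detected motion

    def toggle_mode(self) -> None:
        # Switch between touchpad mode and display mode 540.
        self.mode = (DisplayMode.DISPLAY
                     if self.mode is DisplayMode.TOUCHPAD
                     else DisplayMode.TOUCHPAD)

    def on_motion_detected(self, in_motion: bool) -> None:
        # IARM is enabled when the device or its user is determined to be in motion.
        self.adaptive_running_mode = in_motion
```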
- Although many embodiments of the present invention have just been described above, the present invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Also, it will be understood that, where possible, any of the advantages, features, functions, devices, and/or operational aspects of any of the embodiments of the present invention described and/or contemplated herein may be included in any of the other embodiments of the present invention described and/or contemplated herein, and/or vice versa. In addition, where possible, any terms expressed in the singular form herein are meant to also include the plural form and/or vice versa, unless explicitly stated otherwise. As used herein, “at least one” shall mean “one or more” and these phrases are intended to be interchangeable. Accordingly, the terms “a” and/or “an” shall mean “at least one” or “one or more,” even though the phrase “one or more” or “at least one” is also used herein. Like numbers refer to like elements throughout.
- As will be appreciated by one of ordinary skill in the art in view of this disclosure, the present invention may include and/or be embodied as an apparatus (including, for example, a system, machine, device, computer program product, and/or the like), as a method (including, for example, a business method, computer-implemented process, and/or the like), or as any combination of the foregoing. Accordingly, embodiments of the present invention may take the form of an entirely business method embodiment, an entirely software embodiment (including firmware, resident software, micro-code, stored procedures in a database, etc.), an entirely hardware embodiment, or an embodiment combining business method, software, and hardware aspects that may generally be referred to herein as a “system.” Furthermore, embodiments of the present invention may take the form of a computer program product that includes a computer-readable storage medium having one or more computer-executable program code portions stored therein. As used herein, a processor, which may include one or more processors, may be “configured to” perform a certain function in a variety of ways, including, for example, by having one or more general-purpose circuits perform the function by executing one or more computer-executable program code portions embodied in a computer-readable medium, and/or by having one or more application-specific circuits perform the function.
- It will be understood that any suitable computer-readable medium may be utilized. The computer-readable medium may include, but is not limited to, a non-transitory computer-readable medium, such as a tangible electronic, magnetic, optical, electromagnetic, infrared, and/or semiconductor system, device, and/or other apparatus. For example, in some embodiments, the non-transitory computer-readable medium includes a tangible medium such as a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a compact disc read-only memory (CD-ROM), and/or some other tangible optical and/or magnetic storage device. In other embodiments of the present invention, however, the computer-readable medium may be transitory, such as, for example, a propagation signal including computer-executable program code portions embodied therein.
- One or more computer-executable program code portions for carrying out operations of the present invention may be written in object-oriented, scripted, and/or unscripted programming languages, such as, for example, Java, Perl, Smalltalk, C++, SAS, SQL, Python, Objective C, JavaScript, and/or the like. In some embodiments, the one or more computer-executable program code portions for carrying out operations of embodiments of the present invention are written in conventional procedural programming languages, such as the "C" programming language and/or similar programming languages. The computer program code may alternatively or additionally be written in one or more multi-paradigm programming languages, such as, for example, F#.
- Some embodiments of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of apparatus and/or methods. It will be understood that each block included in the flowchart illustrations and/or block diagrams, and/or combinations of blocks included in the flowchart illustrations and/or block diagrams, may be implemented by one or more computer-executable program code portions. These one or more computer-executable program code portions may be provided to a processor of a general purpose computer, special purpose computer, and/or some other programmable data processing apparatus in order to produce a particular machine, such that the one or more computer-executable program code portions, which execute via the processor of the computer and/or other programmable data processing apparatus, create mechanisms for implementing the steps and/or functions represented by the flowchart(s) and/or block diagram block(s).
- The one or more computer-executable program code portions may be stored in a transitory and/or non-transitory computer-readable medium (e.g., a memory, etc.) that can direct, instruct, and/or cause a computer and/or other programmable data processing apparatus to function in a particular manner, such that the computer-executable program code portions stored in the computer-readable medium produce an article of manufacture including instruction mechanisms which implement the steps and/or functions specified in the flowchart(s) and/or block diagram block(s).
- The one or more computer-executable program code portions may also be loaded onto a computer and/or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer and/or other programmable apparatus. In some embodiments, this produces a computer-implemented process such that the one or more computer-executable program code portions which execute on the computer and/or other programmable apparatus provide operational steps to implement the steps specified in the flowchart(s) and/or the functions specified in the block diagram block(s). Alternatively, computer-implemented steps may be combined with, and/or replaced with, operator- and/or human-implemented steps in order to carry out an embodiment of the present invention.
- While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that this invention not be limited to the specific constructions and arrangements shown and described, since various other changes, combinations, omissions, modifications and substitutions, in addition to those set forth in the above paragraphs, are possible. Those skilled in the art will appreciate that various adaptations, modifications, and combinations of the just described embodiments can be configured without departing from the scope and spirit of the invention. Therefore, it is to be understood that, within the scope of the appended claims, the invention may be practiced other than as specifically described herein.
Claims (26)
1. A method for enabling a user to control a mobile device when the user is in motion, the method comprising:
determining a location of contact on a mobile device display, wherein the display presents at least one option;
determining a first option near the location of contact;
magnifying the first option on the display such that the first option encloses the location of contact; and
initiating execution of a function associated with the first option.
2. The method of claim 1 , wherein the first option is at least one of the most logical option or the option nearest to the location of contact.
3. The method of claim 1 , wherein magnifying the first option comprises increasing the dimensions of the first option.
4. The method of claim 1 , wherein the magnified first option is highlighted or is presented in a different color from the first option.
5. The method of claim 1 , wherein the magnified first option encloses the first option.
6. The method of claim 1 , wherein an outline of the first option is visible inside the magnified first option.
7. The method of claim 1 , wherein the contact is made using either a finger or an object.
8. The method of claim 1 , wherein the mobile device comprises at least one of a mobile phone, a watch, a music player, a camera, or a tablet computing device.
9. The method of claim 1 , wherein the contact is maintained for a predetermined period.
10. The method of claim 1 , wherein determining a first option located near the location of contact comprises determining the first option is selectable.
11. The method of claim 1 , wherein the location of contact is within an area of the first option.
12. The method of claim 1 , wherein the location of contact is determined based on where the contact is released from the display, and wherein the location of contact is not determined based on where the contact is initially detected on the display.
13. The method of claim 1 , wherein the location of contact is determined based on where the contact is initially detected on the display, and wherein the location of contact is not determined based on where the contact is released from the display.
14. The method of claim 1 , wherein the contact is determined based on a camera or a sensor associated with the display.
15. The method of claim 1 , further comprising providing tactile or audio feedback to a user of the mobile device.
16. The method of claim 1 , wherein an intermediary logic layer is provided between an existing application that is executed on the mobile device and sensor or camera logic input for determining the location of contact.
17. The method of claim 1 , further comprising enabling an adaptive running mode when the mobile device determines that the mobile device or the user is in motion.
18. The method of claim 1 , wherein the contact comprises actual contact or virtual contact.
19. (canceled)
20. (canceled)
21. (canceled)
22. (canceled)
23. (canceled)
24. (canceled)
25. An apparatus for enabling a user to control a mobile device when the user is in motion, the apparatus comprising:
a memory;
a processor;
a module, stored in the memory, executable by the processor, and configured to:
determine a location of contact on a mobile device display, wherein the display presents at least one option;
determine a first option near the location of contact;
magnify the first option on the display such that the first option encloses the location of contact; and
initiate execution of a function associated with the first option.
26. A computer program product for enabling a user to control a mobile device when the user is in motion, the computer program product comprising a non-transitory computer-readable medium comprising a set of codes for causing a computer to:
determine a location of contact on a mobile device display, wherein the display presents at least one option;
determine a first option near the location of contact;
magnify the first option on the display such that the first option encloses the location of contact; and
initiate execution of a function associated with the first option.
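The selection flow recited in independent claims 1, 25, and 26 — determine the contact location, determine the first option near it, magnify that option until it encloses the contact location — can be sketched as follows. This is a hypothetical illustration, not the claimed implementation: it assumes options are axis-aligned rectangles, interprets "near" as nearest rectangle center, and uses illustrative names throughout.

```python
from dataclasses import dataclass

@dataclass
class Option:
    """A selectable option rendered on the display (left/top origin)."""
    name: str
    x: float
    y: float
    w: float
    h: float

    def center(self) -> tuple[float, float]:
        return (self.x + self.w / 2, self.y + self.h / 2)

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def magnify_to_enclose(opt: Option, px: float, py: float, step: float = 1.1) -> Option:
    """Grow the option about its center until it encloses the contact point."""
    cx, cy = opt.center()
    w, h = opt.w, opt.h
    while not Option(opt.name, cx - w / 2, cy - h / 2, w, h).contains(px, py):
        w *= step
        h *= step
    return Option(opt.name, cx - w / 2, cy - h / 2, w, h)

def select_option(options: list[Option], px: float, py: float) -> Option:
    """Claim-1 flow: pick the option nearest the contact location and
    return a magnified version that encloses the contact, ready for
    the associated function to be executed."""
    nearest = min(
        options,
        key=lambda o: (o.center()[0] - px) ** 2 + (o.center()[1] - py) ** 2,
    )
    return magnify_to_enclose(nearest, px, py)
```

Growing about the center guarantees termination, since the rectangle eventually covers any fixed point on the display; a real implementation would also cap the magnified size and redraw the option, per claims 3 through 6.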
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IB2013/056816 WO2015025194A1 (en) | 2013-08-22 | 2013-08-22 | Adaptive running mode |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160154566A1 true US20160154566A1 (en) | 2016-06-02 |
Family
ID=49956251
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/354,433 Abandoned US20160154566A1 (en) | 2013-08-22 | 2013-08-22 | Adaptive running mode |
Country Status (3)
Country | Link |
---|---|
US (1) | US20160154566A1 (en) |
EP (1) | EP3036613A1 (en) |
WO (1) | WO2015025194A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170192511A1 (en) * | 2015-09-29 | 2017-07-06 | Telefonaktiebolaget Lm Ericsson (Publ) | Touchscreen Device and Method Thereof |
US20180192842A1 (en) * | 2015-07-28 | 2018-07-12 | Lg Electronics Inc. | Robot cleaner |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070180392A1 (en) * | 2006-01-27 | 2007-08-02 | Microsoft Corporation | Area frequency radial menus |
US8230355B1 (en) * | 2006-03-22 | 2012-07-24 | Adobe Systems Incorporated | Visual representation of a characteristic of an object in a space |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE4406668C2 (en) * | 1993-04-27 | 1996-09-12 | Hewlett Packard Co | Method and device for operating a touch-sensitive display device |
US9189069B2 (en) * | 2000-07-17 | 2015-11-17 | Microsoft Technology Licensing, Llc | Throwing gestures for mobile devices |
JP4683126B2 (en) * | 2008-12-26 | 2011-05-11 | ブラザー工業株式会社 | Input device |
US10976784B2 (en) * | 2010-07-01 | 2021-04-13 | Cox Communications, Inc. | Mobile device user interface change based on motion |
CN102591578A (en) * | 2011-12-30 | 2012-07-18 | 鸿富锦精密工业(深圳)有限公司 | Electronic device with function of amplifying keys on touch screen and method |
-
2013
- 2013-08-22 US US14/354,433 patent/US20160154566A1/en not_active Abandoned
- 2013-08-22 WO PCT/IB2013/056816 patent/WO2015025194A1/en active Application Filing
- 2013-08-22 EP EP13821150.3A patent/EP3036613A1/en not_active Withdrawn
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180192842A1 (en) * | 2015-07-28 | 2018-07-12 | Lg Electronics Inc. | Robot cleaner |
US10631699B2 (en) * | 2015-07-28 | 2020-04-28 | Lg Electronics Inc. | Robot cleaner |
US20170192511A1 (en) * | 2015-09-29 | 2017-07-06 | Telefonaktiebolaget Lm Ericsson (Publ) | Touchscreen Device and Method Thereof |
Also Published As
Publication number | Publication date |
---|---|
WO2015025194A1 (en) | 2015-02-26 |
EP3036613A1 (en) | 2016-06-29 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SONY CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:RAKIC, MILAN;BUNK, RICHARD;SIGNING DATES FROM 20130821 TO 20130826;REEL/FRAME:033289/0115 |
|
AS | Assignment |
Owner name: SONY MOBILE COMMUNICATIONS INC., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SONY CORPORATION;REEL/FRAME:038542/0224 Effective date: 20160414 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |