
US9870135B2 - Time segment user interface - Google Patents

Time segment user interface

Info

Publication number
US9870135B2
Authority
US
United States
Prior art keywords
line
user input
display device
detected direction
received user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US14/605,307
Other versions
US20150212686A1 (en)
Inventor
Thomas G. Hobbs
Samuel J. Smith
Mark A. Woolley
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
International Business Machines Corp
Original Assignee
International Business Machines Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by International Business Machines Corp filed Critical International Business Machines Corp
Assigned to INTERNATIONAL BUSINESS MACHINES CORPORATION reassignment INTERNATIONAL BUSINESS MACHINES CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HOBBS, THOMAS G., SMITH, SAMUEL J., WOOLLEY, MARK A.
Publication of US20150212686A1
Application granted
Publication of US9870135B2

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00: Administration; Management
    • G06Q10/10: Office automation; Time management
    • G06Q10/109: Time management, e.g. calendars, reminders, meetings or time accounting

Definitions

  • This invention relates to a method of operating a device comprising a display device, a user interface device and a processor, and to the device itself.
  • the invention addresses problems with specifying and segmenting a period of time using a graphical user interface (GUI).
  • a method operates a device that includes a display device, a user interface device, and a processor connected to the display device and the user interface device.
  • the processor connected to the display device receives a user input defining a line on the display device; detects one or more direction changes in the received user input of the defined line; and defines a line segment according to the detected direction changes, where a length of the line segment is a distance between a start of the defined line and a first detected direction change.
  • One or more processors segment a time period into multiple time segments, where each time segment corresponds in length to the defined line segment.
  • the display device then displays the segmented time period on the defined line segment.
  • a computer program product operates a device that has a display device, a user interface device, and a processor connected to the display device and the user interface device.
  • the computer program product includes a computer readable storage medium having program code embodied therewith.
  • the computer readable storage medium is not a transitory signal per se, and the program code is readable and executable by a processor to perform a method comprising: receiving a user input defining a line on the display device; detecting one or more direction changes in the received user input of the defined line; defining a line segment according to the detected direction changes, where a length of the line segment is a distance between two successive detected direction changes; segmenting a time period into a plurality of time segments, where each time segment corresponds in length to the defined line segment; and displaying, on the display device, the segmented time period on the defined line segment.
  • a computer system includes: a display device; a user interface device; a processor; a computer readable memory; a computer readable storage medium; first program instructions to receive a user input defining a line on the display device; second program instructions to detect one or more direction changes in the received user input of the defined line; third program instructions to define a plurality of line segments according to the detected direction changes, wherein a length of at least one of the line segments is a distance from a final detected direction change to an end of the defined line; fourth program instructions to segment a time period into a plurality of time segments, where each time segment corresponds in length to a defined line segment; and fifth program instructions to display, on the display device, the segmented time period on the defined line segment.
  • the first, second, third, fourth, and fifth program instructions are stored on said computer readable storage medium for execution by said processor via said computer readable memory.
  • FIG. 1 is a schematic diagram of a touchscreen device
  • FIG. 2 is a schematic diagram of the internal components of the touchscreen device
  • FIG. 3 is a schematic diagram of a graphical user interface
  • FIG. 4 is a schematic diagram of a second graphical user interface
  • FIG. 5 is a flowchart of a method of operating the touchscreen device.
  • FIG. 6 is a flowchart of a second embodiment of the method of operating the touchscreen device.
  • the present invention may be a system, a method, and/or a computer program product.
  • the computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
  • the computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device.
  • the computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • a non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing.
  • a computer readable storage medium is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
  • Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network.
  • the network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers.
  • a network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
  • Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
  • the computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
  • These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the block may occur out of the order noted in the figures.
  • two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • FIG. 1 shows a touchscreen device 10, which is a modern smartphone that has wireless telephony capability but also advanced display and processing functionality, meaning that the device 10 has significant capabilities as a computing device.
  • the device 10 comprises a display device 12, a user interface device 14 and a processor 16 connected to the display device 12 and the user interface device 14; this is shown schematically in FIG. 2.
  • the display and input functionalities are combined in the touchscreen 12 although from a technical point of view these are separate components.
  • the processor 16 controls the operation of the device and executes instructions from a computer program product, which can be provided from a computer readable medium such as a suitable storage medium.
  • High resolution images can be displayed by the touchscreen 12 .
  • the user interacts with the touchscreen 12 using their finger and/or a stylus, depending on the touch technology being used to implement the touchscreen 12 .
  • the touchscreen 12 may display an icon 18 , which the user can touch in order to launch an application that is identified by the respective icon 18 .
  • This user interaction is the same as a user would find on a conventional desktop computer where they would use a mouse, for example, to move a cursor onto an icon in order to launch the respective application, for example by double-clicking on a mouse button.
  • Sophisticated gestures can be recognized by the touchscreen 12 .
  • the user can swipe their finger(s) on the screen to achieve various different interactions with the application being currently accessed by the user.
  • the user can make selections by touching the touchscreen 12 in order to “press” virtual buttons being displayed by the touchscreen 12 .
  • a virtual keyboard can be displayed on the lower half of the touchscreen 12 , which the user can then type on to provide alphanumeric input.
  • FIG. 3 shows an example of a graphical user interface 20 for part of a calendar application.
  • the user will access the GUI 20 in order to create a meeting in the calendar.
  • This Figure shows a preferred implementation of the GUI control, which includes a dial 22 .
  • the Figure depicts an example input where the user wants to create and segment a one-hour event. The user begins by drawing their finger from the 12 o'clock position to the 10-past position, whilst keeping the distance from the center of the dial the same, to indicate one segment.
  • the touchscreen 12 identifies the gesture that is being made by the user.
  • the next segment is 20 minutes; the end of the segment is indicated by drawing the finger towards the center again to indicate the start of the final segment.
  • the final segment is finished and the end time of the event is indicated by the user releasing their finger from the touchscreen 12 .
  • the dotted line 24 is the processed route created by analyzing the user input and “snapping” the distance from the center to a set value.
  • the line 26 shows the actual line drawn by the user, but this is converted to the line 24 .
  • the total number of possible segments would depend on the device 10 and the size of the dial 22.
  • a smaller device such as a watch may have a maximum of five segments to ensure accurate interpretation of the user input route.
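  • As a rough sketch of the dial geometry described above, the angle of a touch point can be mapped to a time offset. This is an illustration only, not the patented implementation; the coordinate convention (screen y increasing downwards, 12 o'clock at the top of the dial) and the function name are assumptions:

```python
import math

def dial_angle_to_minutes(x, y, cx, cy, dial_minutes=60):
    """Convert a touch point on the dial to a time offset.

    The angle is measured clockwise from the 12 o'clock position,
    so a quarter rotation on a 60-minute dial is 15 minutes.
    (cx, cy) is the center of the dial; all names are illustrative.
    """
    dx, dy = x - cx, y - cy
    # atan2(dx, -dy) is 0 at 12 o'clock and grows clockwise
    # in screen coordinates where y increases downwards
    angle = math.degrees(math.atan2(dx, -dy)) % 360
    return dial_minutes * angle / 360
```

With a dial centered at (50, 50), a touch at the 3 o'clock position (100, 50) maps to 15 minutes on a one-hour dial.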
  • a new segment could be indicated by moving the finger away from the center of the circle. A significant change in position is all that is necessary.
  • the dial 22 could represent any period of time (here it is one hour, but this can be changed by the user) and the user could continue their input past a complete rotation. For example, a complete rotation could correspond to one hour, one day, or any arbitrary duration.
  • the device 10 detects the line 26 drawn by the user with their finger on the touchscreen 12 and converts this gesture into the on-screen line 24 , interpreting the user's rough input 26 into a smooth line 24 .
  • This line 24 can be drawn by the user in a single continuous gesture or could also be generated in several stages, depending upon the proficiency of the user with the touchscreen technology of the device 10 .
  • the dial 22 defaults to a set time such as one hour, but this can be changed by the user before they start drawing the line 24 , if it does not fit with the meeting length that they are attempting to define.
  • the user draws the line 24 to indicate the overall length of the meeting in terms of time and also to indicate segments in the meeting using the same line 24 , as it is drawn by the user.
  • the user input comprises a curve forming at least part of a circle and each direction change, which comprises a movement relative to the center of the circle, is used to indicate the different segments within the overall time period.
  • the user has drawn a line 24 that has three distinct segments, the first being of 10 minutes, the second being of 20 minutes and the third being of 30 minutes, with an overall time period of 60 minutes.
  • the user can indicate a time period of longer than 60 minutes in the same single action by continuing to draw past the start point of the line 24 .
  • the processor 16 of the device 10 detects this change of direction and uses the detected direction change as the start and/or end of a time segment within the overall time period defined by the user.
  • the processor 16 will match the direction change to the nearest suitable time period, which might be five-minute steps, rather than generating a very specific but impractical length of time.
  • the first change of direction may in fact be at 8 minutes and 23 seconds, but this is changed by the processor 16 to 10 minutes, as the nearest five-minute step.
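  • This snapping behavior can be sketched as a simple rounding to the nearest step, using the five-minute granularity from the example above (the function name is illustrative):

```python
def snap_to_step(minutes, step=5):
    """Round a raw direction-change time to the nearest step.

    A direction change detected at 8 minutes 23 seconds
    (about 8.38 minutes) snaps to 10 minutes with a 5-minute
    step, avoiding impractically precise segment lengths.
    """
    return step * round(minutes / step)
```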
  • FIG. 4 illustrates a second option for the GUI 20 .
  • the user's line 24 is drawn as a straight line, rather than a curve.
  • the detected direction changes identified by the processor 16 when the user draws the line 24 comprise a movement substantially at right angles to the straight line.
  • the user is drawing a straight line with their finger and is moving their finger at right angles to indicate the presence of the start and/or end of segments.
  • Some devices 10 have displays that are far more elongate than square and these types of displays will favour the drawing of a straight line rather than the drawing of a curved line.
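  • For the straight-line variant, a direction change might be detected as a sufficiently large perpendicular deviation from the line's axis. A minimal sketch, assuming the gesture runs roughly along the x-axis and using an illustrative pixel threshold:

```python
def perpendicular_breaks(points, threshold=20):
    """Find sample indices where the user flicks at right angles.

    points: (x, y) samples of a gesture drawn roughly along the
    x-axis. A deviation in y beyond `threshold` pixels from the
    line's baseline marks a segment boundary; the finger must
    return toward the baseline before the next boundary counts.
    (threshold and coordinate convention are assumptions)
    """
    baseline = points[0][1]
    breaks, away = [], False
    for i, (_, y) in enumerate(points):
        if not away and abs(y - baseline) > threshold:
            breaks.append(i)
            away = True
        elif away and abs(y - baseline) <= threshold:
            away = False
    return breaks
```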
  • the overall time and individual segments will be presented to the user.
  • the user then has the option to label the individual segments within the overall time that has been assigned to the meeting by their original gesture.
  • this can be achieved through the conventional on-screen keyboard that can be utilized to enter alpha-numeric data.
  • the GUI 20 will accept the labelling data and attach it to the relevant segments of the meeting.
  • the user has drawn a line 24 that is indicating a meeting length of one hour.
  • the line 24 has been drawn by the user to define three separate segments, the first of which is 15 minutes in length, the second of which is 40 minutes in length and the third of which is 5 minutes in length. This has been indicated by two right-angled direction changes, which define the start and end of the different segments.
  • the first segment can now be labelled by the user as “introduction”, the second segment can now be labelled “debate” and the third segment can be labelled “summary”. In this way, the user can label the different segments of the meeting.
  • the method of operating the touchscreen device 10 is summarized in FIG. 5 .
  • the method comprises, firstly, step S5.1, which comprises receiving a user input defining a line 24.
  • this line 24 could be a curve or a straight line, for example.
  • the next step in the method is step S5.2, which comprises detecting one or more direction changes in the received user input of the defined line 24.
  • As the line 24 is being drawn by the user (in one or more gestures), direction changes in that line are detected by the processor 16 of the device 10.
  • step S5.3 comprises defining a plurality of line segments according to the detected direction changes, where the length of each line segment is relative to the length of the defined line 24 between either the start of the line 24 and the first direction change, two successive direction changes, or the final direction change and the end of the line 24.
  • the processor 16 assigns specific lengths to the individual parts of the line 24 that have been indicated by the user by the direction changes that were made when the line 24 was created.
  • Each direction change indicates the end of one segment and the start of the next segment.
  • Step S5.4 comprises segmenting a time period into a plurality of time segments, each time segment corresponding in length to a defined line segment, and step S5.5 comprises displaying the segmented time period.
  • the time period is defined from the overall length of the line 24 drawn by the user. Once the overall line 24 has been segmented into separate lengths by the processor 16 , then the overall time period is split up into corresponding segments. This is then displayed to the user, as detailed above. This can allow the user to label the segments, for example. Once the process is complete, then the user can use the segmented (and labelled) meeting schedule by sending it to requested attendees, for example, as is usual within calendar applications.
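  • Steps S5.3 to S5.5 amount to splitting the total duration at the detected break points. A minimal sketch, assuming the direction changes have already been expressed as fractions along the drawn line (names are illustrative):

```python
def segment_time_period(total_minutes, break_fractions):
    """Split a time period at the detected direction changes.

    break_fractions: positions of the direction changes as
    fractions of the whole drawn line, e.g. [0.25, 0.75].
    Returns the length of each segment in minutes: the first
    runs from the start of the line to the first change, the
    last from the final change to the end of the line.
    """
    bounds = [0.0] + sorted(break_fractions) + [1.0]
    return [total_minutes * (b - a) for a, b in zip(bounds, bounds[1:])]
```

A 60-minute period with breaks at one quarter and three quarters of the line yields segments of 15, 30, and 15 minutes.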
  • FIG. 6 shows a second flowchart of a more detailed specific embodiment of the methodology, using the GUI 20 of FIG. 3 on the touchscreen device 10 .
  • the user touches the touchscreen 12 to initiate the interaction.
  • the user begins drawing a curve on the dial 22 .
  • the processor 16, at step S6.3, averages the input from the user to create a smooth line, as can be seen in FIG. 3.
  • the processor 16 works out the distance(s) from the center of the dial 22 and assigns segments to the different user-drawn subdivisions of the line 24 .
  • at step S6.5, the user finishes drawing the curve by taking their finger off the touchscreen 12.
  • the processor 16 analyzes the total rotation that the user defined with their identified gesture.
  • the processor 16 assigns the duration of the individual segments as a percentage of the total rotation.
  • the processor 16 provides a pop-up to allow the user to label all of the segments that have been recognized in step S6.4.
  • once the user labels the segments at step S6.10, the processor 16 creates the necessary events within the overall time period and displays this to the user.
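  • The dial flow of FIG. 6 can be sketched end to end: dips of the finger toward the center mark segment boundaries, and each segment's duration becomes a share of the total rotation. The radius threshold and the (angle, radius) sample layout here are assumptions for illustration:

```python
def dial_segments(samples, r_outer, r_inner):
    """Interpret a dial gesture as time-segment shares.

    samples: (angle_deg, radius) pairs, angle measured clockwise
    from 12 o'clock and increasing as the user rotates. A move
    from the outer radius past the midpoint toward `r_inner`
    marks the end of one segment and the start of the next.
    Returns each segment's fraction of the total rotation.
    (thresholding scheme and names are illustrative)
    """
    mid = (r_outer + r_inner) / 2
    boundaries, inward = [], False
    for angle, radius in samples:
        if not inward and radius < mid:
            boundaries.append(angle)
            inward = True
        elif inward and radius >= mid:
            inward = False
    total = samples[-1][0]
    bounds = [0.0] + boundaries + [total]
    return [(b - a) / total for a, b in zip(bounds, bounds[1:])]
```

Because the result is expressed as fractions of the rotation, it applies equally whether a full rotation means one hour or one day.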
  • the user can use a single simple gesture to provide multiple different pieces of information to the processor 16 .
  • the user interacts with the touchscreen 12 to create a single line 24 that also contains changes of direction indicating where breaks will occur, thereby defining individual segments within the overall time period. This is much simpler for a user than using drop-down menus on a small-screen device such as a smartphone.
  • the input method is not limited to use on a touchscreen device, although this is where the main benefit will be found.
  • a conventional desktop computing environment could also support the methodology, with the user moving the mouse to move an on-screen cursor to define the curve or straight line.
  • a method of operating a device comprising a display device, a user interface device and a processor connected to the display device and the user interface device, the method comprising the steps of receiving a user input defining a line, detecting one or more direction changes in the received user input of the defined line, defining a plurality of line segments according to the detected direction changes, the length of each line segment being relative to the length of the defined line between either the start of the defined line and the first detected direction change, two successive detected direction changes or the final detected direction change and the end of the defined line, segmenting a time period into a plurality of time segments, each time segment corresponding in length to a defined line segment, and displaying the segmented time period.
  • a device comprising a display device, a user interface device and a processor connected to the display device and the user interface device, the device arranged to receive a user input defining a line, detect one or more direction changes in the received user input of the defined line, define a plurality of line segments according to the detected direction changes, the length of each line segment being relative to the length of the defined line between either the start of the defined line and the first detected direction change, two successive detected direction changes or the final detected direction change and the end of the defined line, segment a time period into a plurality of time segments, each time segment corresponding in length to a defined line segment, and display the segmented time period.
  • a computer program product on a computer readable medium for operating a device comprising a display device, a user interface device and a processor connected to the display device and the user interface device, the product comprising instructions for receiving a user input defining a line, detecting one or more direction changes in the received user input of the defined line, defining a plurality of line segments according to the detected direction changes, the length of each line segment being relative to the length of the defined line between either the start of the defined line and the first detected direction change, two successive detected direction changes or the final detected direction change and the end of the defined line, segmenting a time period into a plurality of time segments, each time segment corresponding in length to a defined line segment, and displaying the segmented time period.
  • a computer program comprising computer program code stored on a computer-readable medium to, when loaded into a computer system and executed thereon, cause said computer system to perform all the steps of a method according to the first aspect.
  • the interface component is in the form of a dial/clock-like control. This component allows a user to perform a single circular movement to capture multiple pieces of information. The device will capture both the rotation and the distance from the center and then translate that into data that can be subsequently applied to something such as a meeting agenda.
  • a user can draw (part of) a circle with their finger on the touchscreen to indicate the length of a meeting (in terms of time) and can also indicate segments within the meeting by moving their finger towards the center of the circle as they draw a circular line.
  • the main advantages of the methodology are a single input that captures the total time, the required segments, and the amount of time assigned to each segment. Fewer user actions are required, which saves time and reduces complexity.
  • the methodology also takes up less display space, making it ideal for mobile applications.
  • the allocation of time is easier to grasp when compared to drop-down date/time selections, since the user can visualize the segmented time.
  • the methodology is well suited to mobile devices and can be easily implemented on a small mobile device or watch.
  • the data provided by the input method is scalable, since all data recorded is based on a percentage of the total duration, which means it can be rescaled after creation.
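  • Because each segment is recorded as a fraction of the whole, stretching a meeting preserves the agenda's proportions without any re-drawing. A sketch of this rescaling (the function name is an illustration):

```python
def rescale(segment_fractions, new_total_minutes):
    """Scale percentage-based segments to a new total duration.

    A meeting split 25%/50%/25% keeps those proportions when
    the total is changed, e.g. stretched from one hour to two.
    """
    return [f * new_total_minutes for f in segment_fractions]
```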

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Strategic Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Human Computer Interaction (AREA)
  • Economics (AREA)
  • Quality & Reliability (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Data Mining & Analysis (AREA)
  • User Interface Of Digital Computer (AREA)
  • Development Economics (AREA)
  • Educational Administration (AREA)
  • Game Theory and Decision Science (AREA)

Abstract

A method operates a device that includes a display device, a user interface device, and a processor connected to the display device and the user interface device. The processor connected to the display device receives a user input defining a line on the display device; detects one or more direction changes in the received user input of the defined line; and defines a line segment according to the detected direction changes, where a length of the line segment is a distance between a start of the defined line and a first detected direction change. One or more processors segment a time period into multiple time segments, where each time segment corresponds in length to the defined line segment. The display device then displays the segmented time period on the defined line segment.

Description

BACKGROUND
This invention relates to a method of operating a device comprising a display device, a user interface device and a processor, and to the device itself. In a preferred embodiment, the invention addresses problems with specifying and segmenting a period of time using a graphical user interface (GUI).
Many modern computing devices have constraints on the available display area, for example smartphones and tablet computers have restrictions on the size of the display area, since the computing device must be small enough and light enough to be portable. This means that often the user's interaction with the GUI is far more complex and fiddly than is ideal. A typical example of such a user task would be to generate a structured meeting schedule that specifies the meeting duration and the segmentation of time corresponding to a desired agenda. Current solutions involve multiple UI components, such as multiple drop-down menus, making them awkward and time consuming to use. They also produce results that are visually poor and typically also require large amounts of screen real-estate making them unsuitable for modern devices such as smartphones and watches.
SUMMARY
In an embodiment of the present invention, a method operates a device that includes a display device, a user interface device, and a processor connected to the display device and the user interface device. The processor connected to the display device receives a user input defining a line on the display device; detects one or more direction changes in the received user input of the defined line; and defines a line segment according to the detected direction changes, where a length of the line segment is a distance between a start of the defined line and a first detected direction change. One or more processors segment a time period into multiple time segments, where each time segment corresponds in length to the defined line segment. The display device then displays the segmented time period on the defined line segment.
In an embodiment of the present invention, a computer program product operates a device that has a display device, a user interface device, and a processor connected to the display device and the user interface device. The computer program product includes a computer readable storage medium having program code embodied therewith. The computer readable storage medium is not a transitory signal per se, and the program code is readable and executable by a processor to perform a method comprising: receiving a user input defining a line on the display device; detecting one or more direction changes in the received user input of the defined line; defining a line segment according to the detected direction changes, where a length of the line segment is a distance between two successive detected direction changes; segmenting a time period into a plurality of time segments, where each time segment corresponds in length to the defined line segment; and displaying, on the display device, the segmented time period on the defined line segment.
In an embodiment of the present invention, a computer system includes: a display device; a user interface device; a processor; a computer readable memory; a computer readable storage medium; first program instructions to receive a user input defining a line on the display device; second program instructions to detect one or more direction changes in the received user input of the defined line; third program instructions to define a plurality of line segments according to the detected direction changes, wherein a length of at least one of the line segments is a distance from a final detected direction change and an end of the defined line; fourth program instructions to segment a time period into a plurality of time segments, where each time segment corresponds in length to a defined line segment; and fifth program instructions to display, on the display device, the segmented time period on the defined line segment. The first, second, third, fourth, and fifth program instructions are stored on said computer readable storage medium for execution by said processor via said computer readable memory.
BRIEF DESCRIPTION OF THE DRAWINGS
Preferred embodiments of the present invention will now be described, by way of example only, with reference to the following drawings, in which:
FIG. 1 is a schematic diagram of a touchscreen device,
FIG. 2 is a schematic diagram of the internal components of the touchscreen device,
FIG. 3 is a schematic diagram of a graphical user interface,
FIG. 4 is a schematic diagram of a second graphical user interface,
FIG. 5 is a flowchart of a method of operating the touchscreen device, and
FIG. 6 is a flowchart of a second embodiment of the method of operating the touchscreen device.
DETAILED DESCRIPTION
The present invention may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
FIG. 1 shows a touchscreen device 10, which is a modern smartphone that has wireless telephony capability but also has advanced display and processing functionality, meaning that the device 10 has significant capabilities as a computing device. Such devices 10 are becoming ubiquitous amongst business professionals and are widely used for conventional business applications such as email and calendar functions. The device 10 comprises a display device 12, a user interface device 14 and a processor 16 connected to the display device 12 and the user interface device 14; this is shown schematically in FIG. 2. To the user, the display and input functionalities are combined in the touchscreen 12, although from a technical point of view these are separate components. The processor 16 controls the operation of the device and executes instructions from a computer program product, which can be provided from a computer readable medium such as a suitable storage medium.
High resolution images can be displayed by the touchscreen 12. The user interacts with the touchscreen 12 using their finger and/or a stylus, depending on the touch technology being used to implement the touchscreen 12. For example, the touchscreen 12 may display an icon 18, which the user can touch in order to launch an application that is identified by the respective icon 18. This user interaction is the same as a user would find on a conventional desktop computer where they would use a mouse, for example, to move a cursor onto an icon in order to launch the respective application, for example by double-clicking on a mouse button.
Sophisticated gestures can be recognized by the touchscreen 12. The user can swipe their finger(s) on the screen to achieve various different interactions with the application currently being accessed. The user can make selections by touching the touchscreen 12 in order to “press” virtual buttons being displayed by the touchscreen 12. A virtual keyboard can be displayed on the lower half of the touchscreen 12, which the user can then type on to provide alphanumeric input. Some interactions are natural for the user, but applications representing functions that work well in a conventional desktop computing environment often do not work well within the confines of the smartphone 10.
FIG. 3 shows an example of a graphical user interface 20 for part of a calendar application. The user accesses the GUI 20 in order to create a meeting in the calendar. This Figure shows a preferred implementation of the GUI control, which includes a dial 22, and depicts an example input where the user wants to create and segment a one-hour event. The user begins by drawing their finger from the 12 o'clock position to the 10-past position while keeping the distance from the center of the dial 22 constant, to indicate one segment. The touchscreen 12 identifies the gesture that is being made by the user.
The user then moves their finger towards the center of the dial 22 to indicate a new segment. The next segment is 20 minutes long; its end is indicated by drawing the finger towards the center again, which marks the start of the final segment. The final segment is finished, and the end time of the event indicated, by the user releasing their finger from the touchscreen 12. The dotted line 24 is the processed route created by analyzing the user input and “snapping” the distance from the center to a set value. The line 26 shows the actual line drawn by the user, which is converted into the line 24.
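The detection of these towards-the-center movements can be sketched as follows. This is an illustrative reconstruction, not the patented implementation: the sampling model, the threshold value and the function name are all assumptions.

```python
import math

def radial_direction_changes(points, center, threshold=0.15):
    """Return the indices of touch samples where the finger moves
    markedly towards (or away from) the center of the dial,
    signalling the start of a new segment.  The fractional radius
    change treated as deliberate (`threshold`) is an assumption."""
    cx, cy = center
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    changes = []
    for i in range(1, len(radii)):
        # A large relative jump in distance from the center marks a
        # segment boundary; small wobble in the user's line is ignored.
        if radii[i - 1] > 0 and abs(radii[i] - radii[i - 1]) / radii[i - 1] > threshold:
            changes.append(i)
    return changes
```

With four samples at radius 100 and a final sample at radius 70, only the last transition would be reported as a segment boundary.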
The number of total possible segments depends on the device 10 and the size of the dial 22. For example, a smaller device such as a watch may have a maximum of five segments to ensure accurate interpretation of the user input route. Alternatively, a new segment could be indicated by moving the finger away from the center of the circle; a significant change in position is all that is necessary. The dial 22 could represent any period of time (here it is one hour, but this can be changed by the user) and the user could continue their input past a complete rotation. For example, a complete rotation could correspond to one hour, one day or any arbitrary duration.
The device 10 detects the line 26 drawn by the user with their finger on the touchscreen 12 and converts this gesture into the on-screen line 24, interpreting the user's rough input 26 into a smooth line 24. This line 24 can be drawn by the user in a single continuous gesture or could also be generated in several stages, depending upon the proficiency of the user with the touchscreen technology of the device 10. The dial 22 defaults to a set time such as one hour, but this can be changed by the user before they start drawing the line 24, if it does not fit with the meeting length that they are attempting to define.
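The “snapping” of the rough line 26 onto the smooth line 24 can be pictured as projecting each raw touch sample onto a circle of fixed radius, keeping only the angular component of the user's movement. A minimal sketch, assuming the dial is a circle centered at `center` (names are illustrative):

```python
import math

def snap_to_dial(points, center, radius):
    """Project each raw touch sample onto a circle of fixed radius,
    discarding the user's wobble in distance from the center."""
    cx, cy = center
    snapped = []
    for x, y in points:
        angle = math.atan2(y - cy, x - cx)  # direction of the sample from the center
        snapped.append((cx + radius * math.cos(angle),
                        cy + radius * math.sin(angle)))
    return snapped
```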
The user draws the line 24 to indicate the overall length of the meeting in terms of time and also to indicate segments in the meeting using the same line 24, as it is drawn by the user. In the GUI 20 shown in FIG. 3, the user input comprises a curve forming at least part of a circle and each direction change, which comprises a movement relative to the center of the circle, is used to indicate the different segments within the overall time period. In the example of FIG. 3, the user has drawn a line 24 that has three distinct segments, the first being of 10 minutes, the second being of 20 minutes and the third being of 30 minutes, with an overall time period of 60 minutes. The user can indicate a time period of longer than 60 minutes in the same single action by continuing to draw past the start point of the line 24.
As the user draws the line 24, each time they introduce a direction change (in the example of FIG. 3 towards or away from the center of the dial 22), then the processor 16 of the device 10 detects this change of direction and uses the detected direction change as the start and/or end of a time segment within the overall time period defined by the user. The processor 16 will match the direction change to the nearest suitable time period, which might be five minute time periods, rather than generating a very specific but impractical length of time. For example, in FIG. 3, the first change of direction may in fact be at 8 minutes and 23 seconds, but this is changed by the processor 16 to 10 minutes, as the nearest five minute step.
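The rounding to the nearest suitable time period described above amounts to snapping a raw duration to a fixed step. A sketch, assuming five-minute steps as in the example (the patent does not mandate a particular step size):

```python
def snap_to_step(seconds, step_minutes=5):
    """Round a raw duration in seconds to the nearest multiple of
    `step_minutes`, so that e.g. 8 min 23 s becomes 10 min."""
    step = step_minutes * 60
    return int(round(seconds / step)) * step
```

Here `snap_to_step(8 * 60 + 23)` yields 600 seconds, i.e. the 10 minutes given in the example.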
FIG. 4 illustrates a second option for the GUI 20. In this embodiment, the user's line 24 is drawn as a straight line, rather than a curve. The detected direction changes identified by the processor 16 when the user draws the line 24 comprise a movement substantially at right angles to the straight line. Here the user is drawing a straight line with their finger and is moving their finger at right angles to indicate the presence of the start and/or end of segments. Some devices 10 have displays that are far more elongate than square and these types of displays will favour the drawing of a straight line rather than the drawing of a curved line.
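For the straight-line GUI of FIG. 4, a movement substantially at right angles to the line shows up as a jump in the perpendicular coordinate. A sketch, assuming the line is drawn along the x-axis and that a jump in y beyond a threshold counts as deliberate (axis choice, threshold and names are assumptions):

```python
def perpendicular_breaks(points, threshold=20.0):
    """Return the x-positions at which the user's finger jumps at
    right angles to a line drawn along the x-axis, marking the
    start and/or end of segments."""
    breaks = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        # A large perpendicular jump marks a segment boundary.
        if abs(y1 - y0) > threshold:
            breaks.append(x1)
    return breaks
```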
Once the user has completed the line 24, which can be detected by the user removing their finger from the touchscreen 12 or by the selection of an on-screen finish “button”, then the overall time and individual segments will be presented to the user. The user then has the option to label the individual segments within the overall time that has been assigned to the meeting by their original gesture. On a touchscreen device 10 this can be achieved through the conventional on-screen keyboard that can be utilized to enter alpha-numeric data. The GUI 20 will accept the labelling data and attach it to the relevant segments of the meeting.
For example, in FIG. 4, the user has drawn a line 24 that is indicating a meeting length of one hour. The line 24 has been drawn by the user to define three separate segments, the first of which is 15 minutes in length, the second of which is 40 minutes in length and the third of which is 5 minutes in length. This has been indicated by two right-angled direction changes, which define the start and end of the different segments. The first segment can now be labelled by the user as “introduction”, the second segment can now be labelled “debate” and the third segment can be labelled “summary”. In this way, the user can label the different segments of the meeting.
The method of operating the touchscreen device 10 is summarized in FIG. 5. The method begins with step S5.1, which comprises receiving a user input defining a line 24. As discussed above, this line 24 could be, for example, a curve or a straight line. The next step, S5.2, comprises detecting one or more direction changes in the received user input of the defined line 24. As the line 24 is drawn by the user (in one or more gestures), direction changes in that line are detected by the processor 16 of the device 10.
The next step of the method is step S5.3, which comprises defining a plurality of line segments according to the detected direction changes, where the length of each line segment is relative to the length of the defined line 24 between either the start of the line 24 and the first direction change, two successive direction changes or the final direction change and the end of the line 24. In this way, the processor 16 assigns specific lengths to the individual parts of the line 24 that have been indicated by the user by the direction changes that were made when the line 24 was created. Each direction change indicates the end of one segment and the start of the next segment.
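Step S5.3 can be sketched by treating each detected direction change as a cut point at some distance along the drawn line; the function name and representation below are illustrative, not taken from the patent:

```python
def split_lengths(total_length, change_positions):
    """Return the length of each line segment: from the start of the
    line to the first direction change, between successive direction
    changes, and from the final direction change to the end."""
    bounds = [0.0] + sorted(change_positions) + [total_length]
    return [b - a for a, b in zip(bounds, bounds[1:])]
```

For the example of FIG. 3, cut points at 10 and 30 units along a 60-unit line yield the three segment lengths 10, 20 and 30.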
Step S5.4 comprises segmenting a time period into a plurality of time segments, each time segment corresponding in length to a defined line segment, and step S5.5 comprises displaying the segmented time period. The time period is defined from the overall length of the line 24 drawn by the user. Once the overall line 24 has been segmented into separate lengths by the processor 16, then the overall time period is split up into corresponding segments. This is then displayed to the user, as detailed above. This can allow the user to label the segments, for example. Once the process is complete, then the user can use the segmented (and labelled) meeting schedule by sending it to requested attendees, for example, as is usual within calendar applications.
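Step S5.4 can then divide the overall time period in proportion to the line-segment lengths; a sketch under the same illustrative assumptions:

```python
def segment_time(total_minutes, segment_lengths):
    """Split a time period into time segments whose durations are
    proportional to the corresponding line-segment lengths, so the
    result rescales cleanly if the overall duration changes later."""
    total_length = sum(segment_lengths)
    return [total_minutes * s / total_length for s in segment_lengths]
```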
FIG. 6 shows a second flowchart of a more detailed specific embodiment of the methodology, using the GUI 20 of FIG. 3 on the touchscreen device 10. At step S6.1, the user touches the touchscreen 12 to initiate the interaction. At step S6.2, the user begins drawing a curve on the dial 22. At step S6.3, the processor 16 averages the input from the user to create a smooth line, as can be seen in FIG. 3. At step S6.4, the processor 16 works out the distance(s) from the center of the dial 22 and assigns segments to the different user-drawn subdivisions of the line 24.
At step S6.5, the user finishes drawing the curve by taking their finger off the touchscreen 12. At step S6.6, the processor 16 analyzes the total rotation that the user defined with their identified gesture. At step S6.7, the processor 16 assigns each individual segment a duration corresponding to its percentage of the total rotation. At step S6.8, the processor 16 provides a pop-up to allow the user to label all of the segments that were recognized in step S6.4. At step S6.9, the user labels the segments. At step S6.10, the processor 16 creates the necessary events within the overall time period and displays them to the user.
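Steps S6.6 and S6.7 — analyzing the total rotation and assigning each segment a duration as a percentage of it — might be sketched as follows, where `segment_angles` holds the angular extent of each drawn subdivision in degrees and one full rotation represents `dial_minutes`. Both names, and the degree-based representation, are assumptions for illustration:

```python
def assign_durations(segment_angles, dial_minutes=60):
    """Convert the angular extent of each drawn segment into minutes.
    The total rotation may exceed one full turn, in which case the
    event is correspondingly longer than `dial_minutes`."""
    total = sum(segment_angles)
    rotations = total / 360.0                 # total rotation drawn by the user
    total_minutes = dial_minutes * rotations  # overall event duration
    return [total_minutes * a / total for a in segment_angles]
```

For the FIG. 3 example, segments spanning 60, 120 and 180 degrees of a one-hour dial would yield durations of 10, 20 and 30 minutes.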
In this way, the user can use a single simple gesture to provide multiple different pieces of information to the processor 16. The user interacts with the touchscreen 12 to create a single line 24 containing changes of direction that indicate where breaks will occur, thereby defining individual segments within the overall time period. This is much simpler for a user on a small-screen device such as a smartphone than using drop-down menus. The input method is not limited to use on a touchscreen device, although this is where the main benefit will be found. A conventional desktop computing environment could also support the methodology, with the user moving the mouse to move an on-screen cursor to define the curve or straight line.
According to a first aspect of the present invention, there is provided a method of operating a device comprising a display device, a user interface device and a processor connected to the display device and the user interface device, the method comprising the steps of receiving a user input defining a line, detecting one or more direction changes in the received user input of the defined line, defining a plurality of line segments according to the detected direction changes, the length of each line segment being relative to the length of the defined line between either the start of the defined line and the first detected direction change, two successive detected direction changes or the final detected direction change and the end of the defined line, segmenting a time period into a plurality of time segments, each time segment corresponding in length to a defined line segment, and displaying the segmented time period.
According to a second aspect of the present invention, there is provided a device comprising a display device, a user interface device and a processor connected to the display device and the user interface device, the device arranged to receive a user input defining a line, detect one or more direction changes in the received user input of the defined line, define a plurality of line segments according to the detected direction changes, the length of each line segment being relative to the length of the defined line between either the start of the defined line and the first detected direction change, two successive detected direction changes or the final detected direction change and the end of the defined line, segment a time period into a plurality of time segments, each time segment corresponding in length to a defined line segment, and display the segmented time period.
According to a third aspect of the present invention, there is provided a computer program product on a computer readable medium for operating a device comprising a display device, a user interface device and a processor connected to the display device and the user interface device, the product comprising instructions for receiving a user input defining a line, detecting one or more direction changes in the received user input of the defined line, defining a plurality of line segments according to the detected direction changes, the length of each line segment being relative to the length of the defined line between either the start of the defined line and the first detected direction change, two successive detected direction changes or the final detected direction change and the end of the defined line, segmenting a time period into a plurality of time segments, each time segment corresponding in length to a defined line segment, and displaying the segmented time period.
According to a fourth aspect, there is provided a computer program comprising computer program code stored on a computer-readable medium to, when loaded into a computer system and executed thereon, cause said computer system to perform all the steps of a method according to the first aspect.
Owing to the invention, it is possible to provide an improved GUI methodology that allows a user to input a time period which is segmented in a single simple action. The length of the time period is determined from the length of the line inputted by the user. In a preferred embodiment, the interface component is in the form of a dial/clock-like control. This component allows a user to perform a single circular movement to capture multiple pieces of information. The device will capture both the rotation and the distance from the center and then translate that into data that can be subsequently applied to something such as a meeting agenda. On a touchscreen device such as a mobile phone, a user can draw (part of) a circle with their finger on the touchscreen to indicate the length of a meeting (in terms of time) and can also indicate segments within the meeting by moving their finger towards the center of the circle as they draw a circular line.
The main advantage of the methodology is a single input that captures the total time, the required segments and the amount of time assigned to each segment. Fewer user actions are required, which saves time and reduces complexity. The methodology also saves space, since it takes up less display area, making it ideal for mobile applications. The allocation of time is easier to visualize than with drop-down date/time selections, since the user can see the segmented time directly. The methodology is well suited to mobile devices and can easily be implemented on a small mobile device or watch. The data provided by the input method is scalable, since everything recorded is a percentage of the total duration and can therefore be rescaled after creation.
The descriptions of the various embodiments of the present invention have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (13)

What is claimed is:
1. A method of operating a device comprising a display device, a user interface device, and a processor connected to the display device and the user interface device, the method comprising:
receiving, by the processor connected to the display device, a user input defining a line on the display device;
detecting, by the processor connected to the display device, one or more direction changes in the received user input of the defined line;
defining, by the processor connected to the display device, a line segment according to the detected direction changes, wherein a length of the line segment is a distance between a start of the defined line and a first detected direction change;
segmenting, by one or more processors and based on the detected direction changes in the received user input, the line into a plurality of time segments based on the detected direction changes in the received user input, wherein the line represents a predefined time period, and wherein each time segment from the plurality of time segments corresponds in position and length to a different sub-period of the predefined time period;
generating, by one or more processors, a segmented time line from the plurality of time segments created by the detected direction changes in the received user input, wherein the segmented time line defined by the received user input comprises a curve forming at least part of a circle, and wherein each detected direction change comprises a movement relative to a center of the circle; and
displaying, by one or more processors, the segmented time line on the display device.
2. The method according to claim 1, wherein the segmented time line defined by the received user input further comprises a straight line, and wherein each detected direction change comprises a movement substantially at right angles to the straight line.
3. The method according to claim 1, wherein the segmented time line represents a meeting schedule for a meeting, wherein each sub-period of the predefined time period is for a different topic of the meeting, and wherein the method further comprises:
displaying, on the display device, labels that describe each different topic of the meeting on a corresponding section of the segmented time line.
4. The method of claim 3, further comprising:
populating, by one or more processors, a calendar application with a meeting entry using information derived from the segmented time line and the labels for the meeting.
5. The method according to claim 1, further comprising:
prior to receiving the user input defining the line, displaying, on the display device, an on-screen element as a guide to the user's input, wherein the on-screen element is an overlaid template that is traced by a user to input a particular shape.
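The method claims above describe segmenting a hand-drawn line into time segments at detected direction changes, with each segment's position and length mapped to a sub-period of a predefined time period. As an illustration only, and not the patented implementation, the following Python sketch assumes a roughly horizontal stroke sampled as (x, y) points and models a "direction change" (per claim 2) as a vertical excursion beyond a pixel threshold; all function names, parameters, and the threshold value are hypothetical.

```python
from datetime import datetime, timedelta

def segment_timeline(points, period_start, period_minutes, threshold=10.0):
    """Split a hand-drawn horizontal stroke into time segments.

    A "direction change" is modeled as a vertical excursion (movement at
    right angles to the straight line) larger than `threshold` pixels.
    Segment boundaries are placed at the x-coordinates where excursions
    begin, and each segment's share of `period_minutes` is proportional
    to its horizontal length.
    """
    if len(points) < 2:
        return []
    xs = [p[0] for p in points]
    baseline = points[0][1]
    boundaries = []           # x-positions of detected direction changes
    excursion = False
    for x, y in points:
        if abs(y - baseline) > threshold and not excursion:
            boundaries.append(x)   # rising edge of an excursion
            excursion = True
        elif abs(y - baseline) <= threshold:
            excursion = False
    # Cut points: line start -> first change -> ... -> line end
    cuts = [xs[0]] + boundaries + [xs[-1]]
    total = cuts[-1] - cuts[0]
    segments = []
    t = period_start
    for a, b in zip(cuts, cuts[1:]):
        minutes = period_minutes * (b - a) / total
        segments.append((t, t + timedelta(minutes=minutes)))
        t += timedelta(minutes=minutes)
    return segments
```

For example, a 100-pixel stroke over a 60-minute meeting with one excursion at its midpoint would yield two 30-minute segments, mirroring the claim language that each segment "corresponds in position and length to a different sub-period."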
6. A computer program product for operating a device comprising a display device, a user interface device, and a processor connected to the display device and the user interface device, the computer program product comprising a computer readable storage medium having program code embodied therewith, wherein the computer readable storage medium is not a transitory signal per se, and wherein the program code is readable and executable by a processor to perform a method comprising:
receiving a user input defining a line on the display device;
detecting one or more direction changes in the received user input of the defined line;
defining a line segment according to the detected direction changes, wherein a length of the line segment is a distance between two successive detected direction changes;
segmenting the line into a plurality of time segments based on the detected direction changes in the received user input, wherein the line represents a predefined time period, and wherein each time segment from the plurality of time segments corresponds in position and length to a different sub-period of the predefined time period;
generating a segmented time line from the plurality of time segments created by the detected direction changes in the received user input, wherein the segmented time line defined by the received user input comprises a curve forming at least part of a circle, and wherein each detected direction change comprises a movement relative to a center of the circle; and
displaying, on the display device, the segmented time line.
7. The computer program product of claim 6, wherein the line defined by the received user input further comprises a straight line, and wherein each detected direction change comprises a movement substantially at right angles to the straight line.
8. The computer program product of claim 6, wherein the method further comprises:
displaying a line derived from the received user input of the defined line.
9. The computer program product of claim 6, wherein the method further comprises:
prior to receiving the user input defining the line, displaying an on-screen element on the display device as a guide to the user's input, wherein the on-screen element is an overlaid template that is traced by a user to input a particular shape.
10. A computer system comprising:
a display device;
a user interface device;
a processor;
a computer readable memory;
a computer readable storage medium;
first program instructions to receive a user input defining a line on the display device;
second program instructions to detect one or more direction changes in the received user input of the defined line;
third program instructions to define a plurality of line segments according to the detected direction changes, wherein a length of at least one of the line segments is a distance between a final detected direction change and an end of the defined line;
fourth program instructions to segment the line into a plurality of time segments based on the detected direction changes in the received user input, wherein the line represents a predefined time period, and wherein each time segment from the plurality of time segments corresponds in position and length to a different sub-period of the predefined time period;
fifth program instructions to generate a segmented time line from the plurality of time segments created by the detected direction changes in the received user input, wherein the segmented time line defined by the received user input comprises a curve forming at least part of a circle, and wherein each detected direction change comprises a movement relative to a center of the circle;
sixth program instructions to display, on the display device, the segmented time line;
and wherein said first, second, third, fourth, fifth and sixth program instructions are stored on said computer readable storage medium for execution by said processor via said computer readable memory.
11. The computer system of claim 10, wherein the line defined by the received user input further comprises a straight line, and wherein each detected direction change comprises a movement substantially at right angles to the straight line.
12. The computer system of claim 10, further comprising:
seventh program instructions to display a line derived from the received user input of the defined line, wherein said seventh program instructions are stored on said computer readable storage medium for execution by said processor via said computer readable memory.
13. The computer system of claim 10, further comprising:
seventh program instructions to, prior to receiving the user input defining the line, display an on-screen element on the display device as a guide to the user's input, wherein the on-screen element is an overlaid template that is traced by a user to input a particular shape, and wherein said seventh program instructions are stored on said computer readable storage medium for execution by said processor via said computer readable memory.
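The independent claims also cover a stroke that "comprises a curve forming at least part of a circle," with each direction change being "a movement relative to a center of the circle." As a hedged illustration of that circular variant (again, not the patented implementation), the Python sketch below converts sampled points to polar coordinates about an assumed circle center, treats a radial excursion beyond a threshold as a direction change, and maps boundary angles to fractions of the time period; all names and thresholds are hypothetical.

```python
import math

def segment_circular_timeline(points, center, period_minutes, radial_threshold=10.0):
    """Map a drawn circular arc to (start_minute, end_minute) segments.

    A "direction change" is modeled as a radial excursion: the stroke
    moves toward or away from `center` by more than `radial_threshold`
    relative to the starting radius.  Boundary angles are unwrapped so
    the sweep is monotonic, then scaled to `period_minutes`.
    """
    cx, cy = center

    def polar(p):
        return (math.hypot(p[0] - cx, p[1] - cy),
                math.atan2(p[1] - cy, p[0] - cx))

    r0, a0 = polar(points[0])
    boundaries = []           # angles where radial excursions begin
    excursion = False
    for p in points:
        r, a = polar(p)
        if abs(r - r0) > radial_threshold and not excursion:
            boundaries.append(a)
            excursion = True
        elif abs(r - r0) <= radial_threshold:
            excursion = False
    _, a_end = polar(points[-1])
    cuts = [a0] + boundaries + [a_end]
    # Unwrap angles so the sweep increases monotonically around the circle.
    sweep, prev = [], cuts[0]
    for a in cuts:
        while a < prev:
            a += 2 * math.pi
        sweep.append(a)
        prev = a
    span = (sweep[-1] - sweep[0]) or 1.0
    return [(period_minutes * (a - sweep[0]) / span,
             period_minutes * (b - sweep[0]) / span)
            for a, b in zip(sweep, sweep[1:])]
```

A half-circle stroke over a 120-minute period with one radial bump a quarter-turn in would, under these assumptions, produce two 60-minute segments, consistent with the claimed proportional mapping of segments to sub-periods.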
US14/605,307 2014-01-29 2015-01-26 Time segment user interface Active 2035-10-25 US9870135B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1401477.3 2014-01-29
GB1401477.3A GB2522622A (en) 2014-01-29 2014-01-29 Time segment user interface

Publications (2)

Publication Number Publication Date
US20150212686A1 US20150212686A1 (en) 2015-07-30
US9870135B2 true US9870135B2 (en) 2018-01-16

Family

ID=50287722

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/605,307 Active 2035-10-25 US9870135B2 (en) 2014-01-29 2015-01-26 Time segment user interface

Country Status (2)

Country Link
US (1) US9870135B2 (en)
GB (1) GB2522622A (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018068856A1 (en) * 2016-10-13 2018-04-19 Ratemytate Gmbh Electronic device and method to facilitate the manual input of numerical values
US11741433B2 (en) * 2019-05-22 2023-08-29 Victor Song Interactive scheduling, visualization, and tracking of activities

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4870759A (en) 1988-10-06 1989-10-03 Burton John F Activity entry apparatus for making time entries on pre-established time log forms
WO2002035304A2 (en) 2000-10-26 2002-05-02 Honeywell International Inc. Graphical user interface system for a thermal comfort controller
US20060092177A1 (en) 2004-10-30 2006-05-04 Gabor Blasko Input method and apparatus using tactile guidance and bi-directional segmented stroke
US20070236475A1 (en) 2006-04-05 2007-10-11 Synaptics Incorporated Graphical scroll wheel
US7992102B1 (en) 2007-08-03 2011-08-02 Incandescent Inc. Graphical user interface with circumferentially displayed search results
US8102380B2 (en) 2007-08-30 2012-01-24 Kabushiki Kaisha Toshiba Information processing device, program and method to detect hand rotation gestures
US20120066629A1 (en) 2010-09-15 2012-03-15 Seungwon Lee Method and apparatus for displaying schedule in mobile communication terminal
US20120092267A1 (en) 2010-10-15 2012-04-19 Sap Ag Touch-enabled circle control for time and date entry
US20120151401A1 (en) 2010-12-14 2012-06-14 Samsung Electronics Co. Ltd. Method and apparatus for controlling touch screen using timeline bar, recording medium with program for the same recorded therein, and user terminal having the same
US20120262386A1 (en) 2011-04-15 2012-10-18 Hyuntaek Kwon Touch based user interface device and method
US20130027412A1 (en) 2010-04-14 2013-01-31 Colimote Limited Programmable controllers and schedule timers
US20130097551A1 (en) 2011-10-14 2013-04-18 Edward P.A. Hogan Device, Method, and Graphical User Interface for Data Input Using Virtual Sliders
US20130283213A1 (en) * 2012-03-26 2013-10-24 Primesense Ltd. Enhanced virtual touchpad
US9009188B1 (en) * 2012-06-12 2015-04-14 Google Inc. Drawing-based search queries
US9142056B1 (en) * 2011-05-18 2015-09-22 Disney Enterprises, Inc. Mixed-order compositing for images having three-dimensional painting effects
US9317937B2 (en) * 2013-12-30 2016-04-19 Skribb.it Inc. Recognition of user drawn graphical objects based on detected regions within a coordinate-plane
US20160162603A1 (en) * 2014-12-08 2016-06-09 Dassault Systemes Solidworks Corporation Interactive Surface Alignment


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A. Stetson, "7 Killer User Interface Designs for Gestures", Fresh Tilled Soil, freshtilledsoil.com, Jul. 17, 2012, pp. 1-9.
Anonymous, "Touch Screen Gestures", Embedded Interaction Lab, www.embeddedinteractions.com, retrieved Jan. 26, 2015, pp. 1-9.

Also Published As

Publication number Publication date
US20150212686A1 (en) 2015-07-30
GB201401477D0 (en) 2014-03-12
GB2522622A (en) 2015-08-05

Similar Documents

Publication Publication Date Title
CN106462834B (en) Locating events on a timeline
KR102322718B1 (en) Radial menu user interface with entry point maintenance
US10831356B2 (en) Controlling visualization of data by a dashboard widget
US10394437B2 (en) Custom widgets based on graphical user interfaces of applications
US11150739B2 (en) Chinese character entry via a Pinyin input method
US10956032B2 (en) Keyboard utility for inputting data into a mobile application
US20140089824A1 (en) Systems And Methods For Dynamically Altering A User Interface Based On User Interface Actions
US9519570B2 (en) Progressive snapshots in automated software testing
US20160124931A1 (en) Input of electronic form data
EP2960763A1 (en) Computerized systems and methods for cascading user interface element animations
US11169701B2 (en) Display of a virtual keyboard on a supplemental physical display plane surrounding a primary physical display plane on a wearable mobile device
US10761717B2 (en) Controlling application launch
US11169652B2 (en) GUI configuration
US10394423B2 (en) Efficient list traversal
US9870135B2 (en) Time segment user interface
US10437410B2 (en) Conversation sub-window
US10089001B2 (en) Operating system level management of application display
US10474356B2 (en) Virtual keyboard improvement
US9916084B2 (en) Enlarging or reducing an image on a display screen
US10120555B2 (en) Cursor positioning on display screen
US10168893B2 (en) Identifying input interruption
CN109190097B (en) Method and apparatus for outputting information
US9766807B2 (en) Method and system for giving prompt about touch input operation
CN104951223A (en) Method and device for achieving magnifying lens on touch screen and host
US20120216144A1 (en) Electronic device and method for providing animated page

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HOBBS, THOMAS G.;SMITH, SAMUEL J.;WOOLLEY, MARK A.;SIGNING DATES FROM 20150108 TO 20150114;REEL/FRAME:034812/0824

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4
