US20130339830A1 - Optimized document views for mobile device interfaces - Google Patents
Info
- Publication number
- US20130339830A1 (application US13/524,175)
- Authority
- US
- United States
- Prior art keywords
- document
- content
- client device
- portions
- textual content
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/957—Browsing optimisation, e.g. caching or content distillation
- G06F16/9577—Optimising the visualization of content, e.g. distillation of HTML documents
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0483—Interaction with page-structured environments, e.g. book metaphor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/10—Text processing
- G06F40/103—Formatting, i.e. changing of presentation of documents
- G06F40/106—Display of layout of documents; Previewing
Definitions
- Modern communication systems may enable a user to view a document on a client device such as a mobile device, smart phone, tablet, or other personal computing device.
- a mobile device may have a smaller user interface as compared with display screens of larger computing devices, and when the document is viewed on the smaller user interface of the mobile device, the entire contents of the document may not all be visible at the same time. Sometimes only a portion of the document may be viewed at a time, and the remaining contents may extend outside of the viewing area of the user interface.
- the user may resize a portion of the document, such as a table or textual content, in order to optimally view the entire contents of that portion on the user interface, and the resizing of the portion may cause the entire document to be resized, resulting in certain portions of the document extending outside of the viewing window of the user interface, where they are not visible and/or become an unreadable size.
- Embodiments are directed to optimizing view of documents on relatively smaller mobile device displays.
- a table inserted into a document such as a word processing document may be displayed in a larger size than the content around it for ease of viewability.
- a user may be enabled to move the table in a horizontal direction through swipe actions while the content around the table is preserved.
- textual content of a presentation slide may be displayed along with the slide (above or below the slide) on the mobile device display.
- a user may be enabled to navigate the slide through a horizontal swipe action, while the textual content may be scrolled through a vertical swipe action.
- FIG. 1 illustrates an example networked environment where embodiments may be implemented
- FIG. 2 illustrates an example user interface for optimally displaying textual and other content on a gesture enabled device, according to embodiments
- FIG. 3 illustrates an example user interface for individually resizing portions of content on a gesture enabled device, according to embodiments
- FIG. 4 illustrates an example slide show presentation with accompanying textual content viewed on a user interface, according to embodiments
- FIG. 5 illustrates an example split screen displaying a slide image and accompanying notes on a user interface, according to embodiments
- FIG. 6 is a networked environment, where a system according to embodiments may be implemented
- FIG. 7 is a block diagram of an example computing operating environment, where embodiments may be implemented.
- FIG. 8 illustrates a logic flow diagram for a process of separating portions of document contents into individually controlled sections on a user interface of a client device, according to embodiments.
- portions of document contents may be separated into individually controlled sections on a user interface of a small form client device.
- a table inserted into a document such as a word processing document may be displayed in a larger size than the content around it for ease of viewability.
- a user may be enabled to move the table in a horizontal direction through swipe actions while the content around the table is preserved.
- Textual content of a presentation slide may also be displayed along with the slide (above or below the slide) on the mobile device display.
- a user may be enabled to navigate the slide through a horizontal swipe action, while the textual content may be scrolled through a vertical swipe action.
- a document may include different content portions such as textual content, tables, slides and graphics.
- the document size may be optimized for viewing on the smaller interface of the client device.
- the optimum viewing size of a textual content portion may not be the optimum viewing size for other content portions such as tables and slides.
- program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types.
- embodiments may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and comparable computing devices.
- Embodiments may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
- program modules may be located in both local and remote memory storage devices.
- Embodiments may be implemented as a computer-implemented process (method), a computing system, or as an article of manufacture, such as a computer program product or computer readable media.
- the computer program product may be a computer storage medium readable by a computer system and encoding a computer program that comprises instructions for causing a computer or computing system to perform example process(es).
- the computer-readable storage medium can for example be implemented via one or more of a volatile computer memory, a non-volatile memory, a hard drive, a flash drive, a floppy disk, or a compact disk, and comparable media.
- platform may be a combination of software and hardware components for separating portions of document contents into individually controlled sections on a user interface. Examples of platforms include, but are not limited to, a hosted service executed over a plurality of servers, an application executed on a single computing device, and comparable systems.
- server generally refers to a computing device executing one or more software programs typically in a networked environment. However, a server may also be implemented as a virtual server (software programs) executed on one or more computing devices viewed as a server on the network. More detail on these technologies and example operations is provided below.
- a gesture enabled input device and display screen may be utilized for viewing documents and receiving input from a user over a user interface.
- the gesture enabled input device and display screen may utilize any technology that allows touch input by a user to be recognized.
- some technologies may include, but are not limited to: heat, finger pressure, high capture rate cameras, infrared light, optic capture, tuned electromagnetic induction, ultrasonic receivers, transducer microphones, laser rangefinders, shadow capture, and the like.
- the user interface of a gesture enabled device may display content and documents associated with word processing applications, presentation applications, spreadsheet applications and web page content, and menus of actions for interacting with the displayed content.
- a user may use touch input actions to interact with the user interface to access, view and edit the content, such as documents, tables, spreadsheets, charts, lists, and any content (e.g., audio, video, etc.).
- the gesture enabled input device may make use of features specific to touch or gesture enabled computing devices, but may also work with a traditional mouse and keyboard.
- a touch input action, such as a tap or swipe action as used herein may be provided by a user through a finger, a pen, a mouse, or similar device, as well as through predefined keyboard entry combinations, eye-tracking, and/or a voice command.
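- As a rough illustration of how such touch input might be normalized in a browser-based viewer, the sketch below classifies a touch interaction as a tap or a directional swipe from standard TouchEvent coordinates; the function name and thresholds are illustrative assumptions, not part of the disclosure.

```typescript
// Minimal sketch: classify a touch interaction as a tap or a swipe using
// standard browser TouchEvent coordinates. Thresholds are illustrative.
type Gesture = "tap" | "swipe-left" | "swipe-right" | "swipe-up" | "swipe-down";

function watchGestures(el: HTMLElement, onGesture: (g: Gesture) => void): void {
  let startX = 0;
  let startY = 0;

  el.addEventListener("touchstart", (e: TouchEvent) => {
    startX = e.touches[0].clientX;
    startY = e.touches[0].clientY;
  });

  el.addEventListener("touchend", (e: TouchEvent) => {
    const dx = e.changedTouches[0].clientX - startX;
    const dy = e.changedTouches[0].clientY - startY;
    const threshold = 30; // pixels; smaller movements are treated as a tap

    if (Math.abs(dx) < threshold && Math.abs(dy) < threshold) {
      onGesture("tap");
    } else if (Math.abs(dx) > Math.abs(dy)) {
      onGesture(dx > 0 ? "swipe-right" : "swipe-left");
    } else {
      onGesture(dy > 0 ? "swipe-down" : "swipe-up");
    }
  });
}
```

- A caller could, for example, route "swipe-left"/"swipe-right" to horizontal panning of a table and "swipe-up"/"swipe-down" to scrolling of textual content, mirroring the behaviors described in the embodiments.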
- FIG. 1 illustrates an example networked environment where embodiments may be implemented.
- a user may utilize one or more devices, such as mobile devices 108 , smart phones 104 , 106 , tablets 102 , and other personal computing devices 114 to view documents, applications and web pages over the network.
- the devices such as smart phones 106 and mobile devices 108 may have small memory capacity and low processing or CPU speed compared with larger computing devices, and may operate in conjunction with a remote server over a cloud 110 for retrieving the data associated with a document viewed on the device.
- a remote server 112 residing in the cloud 110 may store, maintain, and manage application data for applications and documents executed on the client device, and through a browser executed on the client device, the client device may connect to the cloud for retrieving requested application data and loading the data onto the client device.
- the client device may connect to the server 112 on the cloud 110 via a networked environment, which may be a wireless or wired network as some examples, and likewise the application data may be delivered from the server 112 residing on the cloud 110 to the client device via the networked environment.
- the client device may retrieve the data over the network from the server 112 residing in the cloud 110 .
- the retrieved data may be retrieved and formatted for display on the client device in a format especially optimized for the size of the user interface of the client device.
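- A minimal sketch of that retrieval step, assuming a browser-based thin client and a hypothetical REST endpoint (the URL and payload shape are not from the patent), is shown below.

```typescript
// Sketch only: the endpoint URL and payload shape are assumptions. A thin
// client asks the server for a document already broken into content portions
// that it can lay out for a small screen.
interface DocumentPortion {
  kind: "text" | "table" | "slide" | "graphic" | "media";
  html: string; // pre-rendered markup for this portion
}

async function loadDocument(docId: string): Promise<DocumentPortion[]> {
  const response = await fetch(`https://example.com/api/documents/${docId}`);
  if (!response.ok) {
    throw new Error(`Failed to load document ${docId}: ${response.status}`);
  }
  return (await response.json()) as DocumentPortion[];
}
```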
- the client device including a mobile device 108 , tablet 102 , and/or smartphone 104 , 106 may have a smaller user interface as compared with display screens of larger computing devices.
- the entire contents of the document may not all be visible at the same time. Sometimes only a portion of the document may be viewed at a time, and the remaining contents may extend outside of the viewing area of the user interface. The user may need to scroll through the document in order to navigate through the entire contents of the document.
- a user may use a touch action or an optically captured gesture such as a tap, drag, and/or swipe to scroll through the document and navigate to subsequent pages and portions of the document.
- the user may also use pinching and expanding touch actions to zoom in and out of the document and resize the document as it is displayed on the user interface.
- a document may include additional elements and content portions along with textual content, such as tables, graphics, slides, and embedded audio/visual files.
- the user may resize a portion of the document, such as a table, in order to optimally view the selected portion, and the resizing of the portion may cause the entire document to be resized, resulting in certain portions of the document extending outside of the viewing window of the user interface and becoming unreadable.
- the optimal size of the textual content for viewing on the user interface of the client device may not be an optimal size for viewing the additional elements, such as embedded tables, graphics, slides, and audio/visual files.
- a table included in the document may be magnified as well, such that the portions of the table extend outside of the user interface and are not viewable on the user interface. If the user resizes the table to make the entire table visible on the user interface, then the textual content of the document may be resized to an unreadable size.
- the user may optimize the size of the slide for viewing on the user interface, causing textual content associated with the slide, such as accompanying notes, not to be optimally visible on the user interface.
- the user may have to continually resize portions of the document, and pan and scroll through the document in order to effectively view all of the contents of the document.
- FIG. 2 illustrates an example user interface for optimally displaying textual and other content on a gesture enabled device, according to embodiments.
- a document may be viewed on the user interface 202 of a client device such as a mobile device, smart phone and/or tablet.
- the document may contain textual content 208 as well as other embedded content such as tables, graphics, slides, and audio/visual files. Often times when the document is viewed on the device, the document size and format may be optimized for viewing within the size constraints of the user interface 202 of the device.
- a table 204 included within the document may be larger than the size of the user interface 202 and may extend outside of the visible size constraints of the user interface 202 .
- a system may enable the textual content 208 to be separated from other content portions included within the document, such as the table 204 , such that each portion of the document as viewed on the user interface 202 may be separately controlled by a user.
- the user may be able to navigate through the textual content 208 without affecting the table 204 , and may additionally be able to navigate and scroll through the table 204 without affecting the textual content 208 .
- an indicator 212 may be provided for indicating to the user that additional portions of content may be available in the direction of the indicator 212 and prompting the user to scroll through the table 204 to view the additional content.
- the user interface 202 may be gesture enabled, such that the user may use a touch action 206 to navigate through portions of the document viewed on the user interface 202 of the client device.
- a touch action 206 such as a swipe or drag in the direction of the indicator 212 to navigate to the portions of the table 204 not currently visible.
- the user may swipe from left to right to display the overflowing table content.
- the user may use a touch action 206 , such as a tap on the indicator 212 to navigate to additional portions of the document not currently visible.
- the user may realize that as the document is currently viewed on the user interface 202 , a portion of the table 204 may extend outside of the size constraints of the user interface 202 .
- the user may execute a touch action 206 , such as a horizontal swipe in the direction of the indicator 212 , to scroll through the table 204 in order to display the remaining portions of the table 204 on the user interface.
- the user interface 210 may scroll through the table 204 and may display portions 214 of the table that were not displayed in the original user interface 202 .
- the textual content 208 that is not a part of the table 204 may remain fixed within the user interface 202 , such that the size and position of the textual content 208 may remain unchanged.
- the user may independently navigate and control textual content 208 using touch actions on the user interface, and while the textual content 208 may be scrolled and resized, the table 204 portion of the document may remain fixed within the user interface 202 , such that the size and position of the table 204 may remain unchanged.
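- One plausible way to realize this independent horizontal scrolling in an HTML-rendered document is to give each table its own overflow container, so a swipe pans only the table while the surrounding text keeps its size and position; the sketch below is an illustration under that assumption, not the patented implementation.

```typescript
// Sketch: wrap each table in its own horizontally scrollable container so a
// swipe on the table pans the table only, leaving surrounding text untouched.
// The class name is illustrative.
function makeTablesIndependentlyScrollable(docRoot: HTMLElement): void {
  docRoot.querySelectorAll("table").forEach((table) => {
    const wrapper = document.createElement("div");
    wrapper.className = "table-scroll-region";
    wrapper.style.overflowX = "auto";  // horizontal panning via swipe
    wrapper.style.overflowY = "hidden";
    table.parentNode?.insertBefore(wrapper, table);
    wrapper.appendChild(table);        // move the table into its own region
  });
}
```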
- a single indicator may be used, left or right depending on a language of the system, without prompting the user to navigate in a particular direction.
- an additional indicator 216 may be provided in order to indicate that there may be additional portions of the table in another direction which may not be visible on the user interface 212 .
- the additional indicator 216 may indicate that there are portions of the table 204 available to the left of the table as currently viewed, while the indicator 212 on the right may indicate additional portions available to the right.
- indicators may be provided above and below the currently viewed table prompting the user to scroll up and down with vertical touch actions, as well as horizontally to the left and to the right.
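- A sketch of how such directional indicators might be driven from the scroll position of an independently scrollable region follows; the element lookups and styling are hypothetical.

```typescript
// Sketch: show "more content" indicators on either side of a scrollable
// region based on its current scroll position.
function wireOverflowIndicators(
  region: HTMLElement,
  leftIndicator: HTMLElement,
  rightIndicator: HTMLElement
): void {
  const refresh = () => {
    const maxScroll = region.scrollWidth - region.clientWidth;
    leftIndicator.style.visibility = region.scrollLeft > 0 ? "visible" : "hidden";
    rightIndicator.style.visibility =
      region.scrollLeft < maxScroll ? "visible" : "hidden";
  };
  region.addEventListener("scroll", refresh);
  refresh(); // set the initial state
}
```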
- the user may be able to add formatting to each content portion individually without affecting the formatting of the other content portions of the document.
- the user may format the table to be right aligned while the other content is left aligned.
- the user may also be enabled to swipe from right to left (or left to right) in order to read the portions of the table that extend outside of the user interface without affecting the format of the textual content above and below the table.
- FIG. 3 illustrates an example user interface for individually resizing portions of content on a gesture enabled device, according to embodiments.
- a system may enable different content portions of a document viewed on a user interface 302 of a client device to be separately controlled on the user interface 302 .
- a user may use touch actions to navigate through textual content portions 308 of the document without affecting the size and position of other portions of the document.
- the user may scroll, pan, resize, and reformat a table 310 embedded within the document without affecting the size, position, and formatting of the textual content 308 of the document.
- the user may resize the table 310 in order to view the entire table 310 within the user interface 302 .
- the user may use expanding and pinching touch actions 314 to zoom in and out of the table to magnify and reduce the size of the table 310 as it is displayed on the user interface.
- the textual content portions 308 may remain separate such that the resizing action on the table 310 does not operate to resize the textual content portions 308 .
- the textual content portions 308 that are not a part of the table 310 may remain fixed within the user interface 302 , such that the size and position of the textual content portions 308 may remain unchanged.
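- The pinch-to-resize behavior could be approximated in a browser viewer by scaling only the selected element with a CSS transform, as in the sketch below; the gesture thresholds and the use of a transform are illustrative choices rather than the disclosed mechanism.

```typescript
// Sketch: pinch/expand on a specific element scales that element only, via a
// CSS transform, so surrounding text keeps its size and position.
function enablePinchResize(target: HTMLElement): void {
  let startDistance = 0;
  let baseScale = 1;    // scale committed by previous gestures
  let currentScale = 1;

  const distance = (e: TouchEvent) =>
    Math.hypot(
      e.touches[0].clientX - e.touches[1].clientX,
      e.touches[0].clientY - e.touches[1].clientY
    );

  target.addEventListener("touchstart", (e: TouchEvent) => {
    if (e.touches.length === 2) startDistance = distance(e);
  });

  target.addEventListener(
    "touchmove",
    (e: TouchEvent) => {
      if (e.touches.length === 2 && startDistance > 0) {
        e.preventDefault(); // keep the browser from zooming the whole page
        currentScale = baseScale * (distance(e) / startDistance);
        target.style.transformOrigin = "top left";
        target.style.transform = `scale(${currentScale})`;
      }
    },
    { passive: false }
  );

  target.addEventListener("touchend", () => {
    baseScale = currentScale; // commit the new size
    startDistance = 0;
  });
}
```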
- FIG. 4 illustrates an example slide show presentation with accompanying textual content viewed on a user interface, according to embodiments.
- a user interface 402 on a client device may enable a user to view a document, such as a slide show presentation, on the client device. Due to the smaller size of some client device user interfaces, all of the contents of the document may not be displayed simultaneously, and the user may need to navigate through and resize portions of the document in order to optimally read and view the document.
- a slide image 404 may be displayed on the user interface 402 , and the device may also request the data for the textual content 408 of the slide for displaying along with the slide image 404 on the user interface 402 .
- the slide image 404 and the textual content 408 may be separated such that they may each be independently controlled and navigated by the user on the user interface.
- the slide image 404 and the textual content 408 may be displayed as a split screen on the user interface 402 such that the slide image split screen 412 and the textual content split screen 414 may be independent from each other and separately controlled by a user.
- the split screen may be positioned side by side and/or top and bottom depending on an orientation and size of the user interface 402 of the client device.
- the user may use touch actions such as a swipe, tap or drag on each split screen to control the content contained within the split screen. For example, the user may zoom in/out, resize, and pan the slide image 404 within slide image split screen 412 without affecting the size and position of the textual content 408 within the textual content split screen 414. Additionally, the user interface may be configured to enable the user to swipe in a horizontal direction on the slide image split screen 412 for navigating to a new slide of the slide show presentation.
- the new slide image may be displayed on the slide show split screen 412 , and the client device may retrieve the accompanying textual content for the new slide and may update and display the new slide accompanying textual content in the textual content split screen 414 .
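- The following sketch, which assumes a browser-based viewer with hypothetical pane ids and a hypothetical slide endpoint, wires a horizontal swipe on the slide pane to slide navigation and refreshes the accompanying text pane, which keeps its own vertical scrolling.

```typescript
// Sketch of the FIG. 4 split view: a horizontal swipe on the slide pane moves
// to the adjacent slide and refreshes the text pane. Ids, endpoint, and the
// 40-pixel swipe threshold are illustrative assumptions.
interface Slide {
  imageUrl: string;
  text: string;
}

async function fetchSlide(deckId: string, index: number): Promise<Slide> {
  const res = await fetch(`https://example.com/api/decks/${deckId}/slides/${index}`);
  return (await res.json()) as Slide;
}

function initSlideSplitView(deckId: string, slideCount: number): void {
  const slidePane = document.getElementById("slide-pane") as HTMLImageElement;
  const textPane = document.getElementById("text-pane") as HTMLElement;
  textPane.style.overflowY = "auto"; // vertical swipes scroll the text only

  let index = 0;
  let startX = 0;

  const show = async (i: number) => {
    index = Math.max(0, Math.min(slideCount - 1, i));
    const slide = await fetchSlide(deckId, index);
    slidePane.src = slide.imageUrl;    // slide image split screen
    textPane.textContent = slide.text; // accompanying textual content
    textPane.scrollTop = 0;
  };

  slidePane.addEventListener("touchstart", (e: TouchEvent) => {
    startX = e.touches[0].clientX;
  });
  slidePane.addEventListener("touchend", (e: TouchEvent) => {
    const dx = e.changedTouches[0].clientX - startX;
    if (Math.abs(dx) > 40) {
      void show(dx < 0 ? index + 1 : index - 1); // swipe left goes to the next slide
    }
  });

  void show(0);
}
```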
- the textual content 408 may be displayed in the textual content split screen 414 , all of the textual content 408 may not be viewable within the textual content split screen 414 .
- the user may perform a touch action 406 to scroll through the textual content 408 and to navigate to the portions of the textual content 408 which extend outside of the viewable area of the textual content touch screen 414 .
- the textual content 408 may overflow over the edge of the textual content touch screen 414 indicating to the user that there is additional textual content 418 and prompting the user to swipe further down to read through the textual content.
- the user may perform a swipe action on the textual content touch screen 414 in a vertical direction to display the overflowing additional textual content 418 within the textual content touch screen 414 .
- the user may also use a drag action to pan the textual content 408 in any direction in which the text may overflow.
- the slide image 404 may remain fixed in the slide image split screen 412 while the user scrolls through the accompanying text within the textual content touch screen 414 .
- the user interface 402 may provide an indicator for prompting the user to navigate to the additional textual content 418 .
- FIG. 5 illustrates an example split screen displaying a slide image and accompanying notes on a user interface, according to embodiments.
- a slide show presentation may be displayed on a user interface of a client device, and a slide image 504 may be displayed on the user interface 502 as well as textual content and additional notes 508 which may accompany the slide image 504 .
- the slideshow presentation may be displayed as a split screen on the user interface 502 such that the slide image split screen 512 and the textual content split screen 514 may be independent from each other and separately controlled by a user.
- the user may perform a touch action 506 , such as a swipe, in a horizontal direction on the slide image split screen 512 for navigating to a new slide within the slide show presentation. Additionally the user may perform pinching, expanding and panning actions to resize and reposition the slide image 504 within the slide image split screen 512 . While the user may resize and reposition the slide image 504 within the slide image split screen 512 , the size and position of the textual content and the notes 508 included within the textual content split screen 514 may remain unchanged.
- the textual content that is contained within the slide image 504 may be initially displayed in the textual content split screen 514 .
- the user may swipe the textual content split screen 514 in a vertical direction to display overflowing textual content.
- the user may perform a touch action, such as a swipe, in a horizontal direction on the textual content split screen 514 to cause the notes 508 to be displayed.
- All of the content of the notes 508 may not be viewable within the textual content split screen 514 , and the user may also perform a touch action 506 within the textual content split screen 514 to navigate to the portions of the notes 508 which extend outside of the viewable area of the textual content touch screen 514 .
- the text of the notes 508 may overflow over the edge of the textual content touch screen 514 indicating to the user that there is additional textual content 518 and prompting the user to swipe further down to read through the textual content.
- the user may perform a swipe action in a vertical direction to display the overflowing notes 508 within the textual content touch screen 514 , and may additionally perform a drag action to pan the notes 508 in any direction. Additionally the user may perform pinching and expanding actions to resize the textual content and notes included within the textual content split screen 514 to optimize the display of the textual content and notes. While the user navigates through the textual content and the notes 508 included within the textual content split screen 514 , the slide number, and size and position of the slide image may remain unchanged.
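- A sketch of the text-pane behavior described here, assuming a hypothetical pane id and data passed in by the caller: a clearly horizontal swipe toggles between the slide's text and its notes, while vertical motion is left to the pane's own scrolling.

```typescript
// Sketch: a horizontal swipe on the text pane toggles between the slide's own
// text and its accompanying notes; vertical swipes keep scrolling the pane.
function initTextPaneToggle(slideText: string, notes: string): void {
  const pane = document.getElementById("text-pane") as HTMLElement;
  pane.style.overflowY = "auto";

  const views = [slideText, notes];
  let viewIndex = 0;
  let startX = 0;
  let startY = 0;

  const render = () => {
    pane.textContent = views[viewIndex];
    pane.scrollTop = 0;
  };

  pane.addEventListener("touchstart", (e: TouchEvent) => {
    startX = e.touches[0].clientX;
    startY = e.touches[0].clientY;
  });

  pane.addEventListener("touchend", (e: TouchEvent) => {
    const dx = e.changedTouches[0].clientX - startX;
    const dy = e.changedTouches[0].clientY - startY;
    // Only clearly horizontal motion toggles the view; vertical motion is
    // handled by the pane's native scrolling.
    if (Math.abs(dx) > 40 && Math.abs(dx) > Math.abs(dy)) {
      viewIndex = (viewIndex + 1) % views.length;
      render();
    }
  });

  render();
}
```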
- FIG. 1 through FIG. 5 are for illustration purposes only and do not constitute a limitation on embodiments.
- a system for separating portions of document contents into individually controlled sections on a user interface may be implemented with other user interfaces, interface elements, presentations, and configurations using the principles described herein.
- FIG. 6 is an example networked environment, where embodiments may be implemented.
- a system for separating portions of document contents into individually controlled sections on a user interface of a client device may be implemented via software executed over one or more servers 614 such as a hosted service.
- the platform may communicate with client applications on individual computing devices such as a smart phone 613 , a laptop computer 612 , or desktop computer 611 (‘client devices’) through network(s) 610 .
- Client applications executed on any of the client devices 611 - 613 may facilitate communications via application(s) executed by servers 614 , or on individual server 614 .
- An application executed on one of the servers may facilitate separating portions of document contents into individually controlled sections on a user interface of a client device.
- the application may retrieve relevant data from data store(s) 615 directly or through database server 618 , and provide requested services (e.g. document editing) to the user(s) through client devices 611 - 613 .
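- Purely as an illustration of such a service, the Node sketch below exposes a hypothetical route that returns a document already split into content portions; the route, port, and data are placeholders rather than anything disclosed in the patent.

```typescript
// Minimal Node sketch (not the patent's service): an HTTP endpoint returning
// a document split into separately controllable portions.
import { createServer } from "node:http";

const documents: Record<string, Array<{ kind: string; html: string }>> = {
  demo: [
    { kind: "text", html: "<p>Introductory paragraph...</p>" },
    { kind: "table", html: "<table><tr><td>Q1</td><td>Q2</td></tr></table>" },
    { kind: "text", html: "<p>Closing paragraph...</p>" },
  ],
};

createServer((req, res) => {
  const match = /^\/api\/documents\/(\w+)$/.exec(req.url ?? "");
  const doc = match ? documents[match[1]] : undefined;
  if (!doc) {
    res.statusCode = 404;
    res.end();
    return;
  }
  res.writeHead(200, { "Content-Type": "application/json" });
  res.end(JSON.stringify(doc));
}).listen(8080);
```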
- Network(s) 610 may comprise any topology of servers, clients, Internet service providers, and communication media.
- a system according to embodiments may have a static or dynamic topology.
- Network(s) 610 may include secure networks such as an enterprise network, an unsecure network such as a wireless open network, or the Internet.
- Network(s) 610 may also coordinate communication over other networks such as Public Switched Telephone Network (PSTN) or cellular networks.
- network(s) 610 may include short range wireless networks such as Bluetooth or similar ones.
- Network(s) 610 provide communication between the nodes described herein.
- network(s) 610 may include wireless media such as acoustic, RF, infrared and other wireless media.
- FIG. 7 and the associated discussion are intended to provide a brief, general description of a suitable computing environment in which embodiments may be implemented.
- computing device 700 may be any computing device executing an application for separating portions of document contents into individually controlled sections on a user interface of a client device according to embodiments, and may include at least one processing unit 702 and system memory 704.
- Computing device 700 may also include a plurality of processing units that cooperate in executing programs.
- system memory 704 may be volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.) or some combination of the two.
- System memory 704 typically includes an operating system 705 suitable for controlling the operation of the platform, such as the WINDOWS® operating systems from MICROSOFT CORPORATION of Redmond, Wash.
- the system memory 704 may also include one or more software applications such as an application 724 and gesture detection module 726 .
- the application 724 may facilitate recognizing separate content portions of a document that is viewed on a gesture enabled client device such as a smart phone, mobile device, and/or tablet and separating the content portions into individually controlled portions on the client device.
- the application 724 may enable a computing device 700 to detect a document that is viewed on the user interface of a client device such as a smart phone, and to identify different content portions of the document, such as textual content, tables, graphics, slide images, and audio/visual files.
- the application 724 may facilitate separating each content portion so that each content portion can be separately and individually controlled by a user without affecting the other content portions.
- application 724 may detect a touch action on a select content portion of the document on the user interface of the client device.
- the gesture detection module 726 may enable the user to resize, reposition, reformat and scroll through the selected content portion without affecting the size and position of the rest of the document.
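- A sketch of the "identify and separate" behavior attributed to application 724, under the assumption of an HTML-rendered document: non-text portions are found with illustrative selectors and each is wrapped in its own independently scrollable container, to which a gesture detection module could then attach its handlers.

```typescript
// Sketch: scan a rendered document for non-text portions and give each its
// own individually controllable container. Selectors are assumptions.
const PORTION_SELECTORS = ["table", "img", "video", "audio", ".slide"];

function separateContentPortions(docRoot: HTMLElement): HTMLElement[] {
  const sections: HTMLElement[] = [];
  docRoot
    .querySelectorAll<HTMLElement>(PORTION_SELECTORS.join(","))
    .forEach((el) => {
      const section = document.createElement("div");
      section.className = "independent-section";
      section.style.overflow = "auto"; // each section pans/scrolls on its own
      el.parentNode?.insertBefore(section, el);
      section.appendChild(el);
      sections.push(section);
    });
  return sections;
}
```

- A gesture detection module in this arrangement would attach its touch handlers to each returned section so that a gesture on one section never resizes or repositions the others.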
- the application 724 and gesture detection module 726 may be separate applications or integrated modules of a hosted service. This basic configuration is illustrated in FIG. 7 by those components within dashed line 708 .
- Computing device 700 may have additional features or functionality.
- the computing device 700 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape.
- additional storage is illustrated in FIG. 7 by removable storage 709 and non-removable storage 710 .
- Computer readable storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
- System memory 704 , removable storage 709 and non-removable storage 710 are all examples of computer readable storage media.
- Computer readable storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing device 700 . Any such computer readable storage media may be part of computing device 700 .
- Computing device 700 may also have input device(s) 712 such as keyboard, mouse, pen, voice input device, touch input device, and comparable input devices.
- Output device(s) 714 such as a display, speakers, printer, and other types of output devices may also be included. These devices are well known in the art and need not be discussed at length here.
- Computing device 700 may also contain communication connections 716 that allow the device to communicate with other devices 718 , such as over a wired or wireless network in a distributed computing environment, a satellite link, a cellular link, a short range network, and comparable mechanisms.
- Other devices 718 may include computer device(s) that execute communication applications, web servers, and comparable devices.
- Communication connection(s) 716 is one example of communication media.
- Communication media can include therein computer readable instructions, data structures, program modules, or other data.
- communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.
- Example embodiments also include methods. These methods can be implemented in any number of ways, including the structures described in this document. One such way is by machine operations, of devices of the type described in this document.
- Another optional way is for one or more of the individual operations of the methods to be performed in conjunction with one or more human operators performing some of the operations. These human operators need not be collocated with each other, but each can be with a machine that performs a portion of the program.
- FIG. 8 illustrates a logic flow diagram for a process of separating portions of document contents into individually controlled sections on a user interface of a client device, according to embodiments.
- Process 800 may be implemented on a computing device or similar electronic device capable of executing instructions through a processor.
- Process 800 begins with operation 810 , where the system may detect a document viewed on the user interface of the client device.
- the client device may be a mobile device, smart phone, and/or tablet having a restricted size user interface.
- the system may identify different individual content portions of the document viewed on the client device.
- the individual content portions may include textual content, tables, graphics, slides, and embedded audio/visual files.
- the system may separate each of the individual content portions into individually controlled sections.
- the system may separate the sections via coding behind the document, and additionally the system may separate the sections visually by providing a split screen, which may be side by side and/or top and bottom.
- Each section of content may be independent from the other content portions such that they may be navigated and controlled by a user without affecting other content portions.
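- The visual separation into a split screen could, for example, pick its arrangement from the viewport orientation, side by side in landscape and top and bottom in portrait; the sketch below assumes a hypothetical container element and is not the patent's layout logic.

```typescript
// Sketch: choose the split-screen arrangement from the viewport orientation.
function layoutSplitScreen(): void {
  const container = document.getElementById("split-container") as HTMLElement;
  const landscape = window.innerWidth > window.innerHeight;

  container.style.display = "flex";
  // Side by side in landscape, stacked top and bottom in portrait.
  container.style.flexDirection = landscape ? "row" : "column";

  for (const child of Array.from(container.children) as HTMLElement[]) {
    child.style.flex = "1 1 50%";
    child.style.overflow = "auto"; // each half stays independently scrollable
  }
}

// Re-evaluate the arrangement when the device is rotated or resized.
window.addEventListener("resize", layoutSplitScreen);
layoutSplitScreen();
```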
- the system may detect a touch action (or comparable input such as optically captured gesture, pen input, voice input, eye-tracking, etc.) on a selected content portion.
- a touch action may include a tap, swipe, pinch, expand, and/or drag on the content included within the selected content portion in order to scroll, resize, and reposition the selected content portion.
- the system may enable the user to control the selected content portion without affecting the remaining content portions of the document. While the user uses touch actions to scroll, resize, and reposition the selected content portion, the remaining content portions may remain fixed within the user interface of the client device, such that the size and position of the remaining content portions do not change and are not affected by the user's actions on the selected content portion.
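- One way to confine the effect of a detected touch action to the selected section is event delegation, as in the sketch below; the class name matches the hypothetical wrapper used in the earlier sketches and is not part of the disclosure.

```typescript
// Sketch: route a touch action to the section it landed on and mark it as the
// only section the subsequent gesture handlers will scroll, resize, or move.
document.addEventListener("touchstart", (e: TouchEvent) => {
  const target = e.target as HTMLElement | null;
  const section = target?.closest<HTMLElement>(".independent-section");
  if (!section) return;

  // Clear any previously active section, then activate the selected one.
  document
    .querySelectorAll<HTMLElement>(".independent-section.active")
    .forEach((el) => el.classList.remove("active"));
  section.classList.add("active");
});
```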
- process 800 is for illustration purposes. Separating portions of document contents into individually controlled sections on a user interface of a client device may be implemented by similar processes with fewer or additional steps, as well as in different order of operations using the principles described herein.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Databases & Information Systems (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Computational Linguistics (AREA)
- General Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Data Mining & Analysis (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Portions of document contents are separated into individually controlled sections on a user interface of a smaller size client device display. A document viewed on a mobile device may include different content portions such as textual content, tables, slides and graphics. Due to a smaller user interface of the mobile device, some portions of the content may extend outside of the user interface and may not all be visible at the same time. The user may use gestures to scroll through and resize the document to view all of the contents. The system may separate each of the different content portions into individual sections and enable the user to control each section separately, such that the user may navigate, resize, and reposition each individual section without affecting the size and position of the remaining sections of the document for optimally viewing the document on the user interface.
Description
- Modern communication systems may enable a user to view a document on a client device such as a mobile device, smart phone, tablet, or other personal computing device. Often times a mobile device may have a smaller user interface as compared with display screens of larger computing devices, and when the document is viewed on the smaller user interface of the mobile device, the entire contents of the document may not all be visible at the same time. Sometimes only a portion of the document may be viewed at a time, and the remaining contents may extend outside of the viewing area of the user interface.
- The user may resize a portion of the document, such as a table or textual content, in order to optimally view the entire contents of that portion on the user interface, and the resizing of the portion may cause the entire document to be resized, resulting in certain portions of the document extending outside of the viewing window of the user interface, where they are not visible and/or become an unreadable size.
- This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to exclusively identify key features or essential features of the claimed subject matter, nor is it intended as an aid in determining the scope of the claimed subject matter.
- Embodiments are directed to optimizing view of documents on relatively smaller mobile device displays. In some embodiments, a table inserted into a document such as a word processing document may be displayed in a larger size than the content around it for ease of viewability. A user may be enabled to move the table in a horizontal direction through swipe actions while the content around the table is preserved. In other embodiments, textual content of a presentation slide may be displayed along with the slide (above or below the slide) on the mobile device display. A user may be enabled to navigate the slide through a horizontal swipe action, while the textual content may be scrolled through a vertical swipe action.
- These and other features and advantages will be apparent from a reading of the following detailed description and a review of the associated drawings. It is to be understood that both the foregoing general description and the following detailed description are explanatory and do not restrict aspects as claimed.
- FIG. 1 illustrates an example networked environment where embodiments may be implemented;
- FIG. 2 illustrates an example user interface for optimally displaying textual and other content on a gesture enabled device, according to embodiments;
- FIG. 3 illustrates an example user interface for individually resizing portions of content on a gesture enabled device, according to embodiments;
- FIG. 4 illustrates an example slide show presentation with accompanying textual content viewed on a user interface, according to embodiments;
- FIG. 5 illustrates an example split screen displaying a slide image and accompanying notes on a user interface, according to embodiments;
- FIG. 6 is a networked environment, where a system according to embodiments may be implemented;
- FIG. 7 is a block diagram of an example computing operating environment, where embodiments may be implemented; and
- FIG. 8 illustrates a logic flow diagram for a process of separating portions of document contents into individually controlled sections on a user interface of a client device, according to embodiments.
- As briefly described above, portions of document contents may be separated into individually controlled sections on a user interface of a small form client device. A table inserted into a document such as a word processing document may be displayed in a larger size than the content around it for ease of viewability. A user may be enabled to move the table in a horizontal direction through swipe actions while the content around the table is preserved. Textual content of a presentation slide may also be displayed along with the slide (above or below the slide) on the mobile device display. A user may be enabled to navigate the slide through a horizontal swipe action, while the textual content may be scrolled through a vertical swipe action.
- A document may include different content portions such as textual content, tables, slides and graphics. The document size may be optimized for viewing on the smaller interface of the client device. However, the optimum viewing size of a textual content portion may not be the optimum viewing size for other content portions such as tables and slides.
- In the following detailed description, references are made to the accompanying drawings that form a part hereof, and in which are shown by way of illustrations specific embodiments or examples. These aspects may be combined, other aspects may be utilized, and structural changes may be made without departing from the spirit or scope of the present disclosure. The following detailed description is therefore not to be taken in a limiting sense, and the scope of the present invention is defined by the appended claims and their equivalents.
- While the embodiments will be described in the general context of program modules that execute in conjunction with an application program that runs on an operating system on a computing device, those skilled in the art will recognize that aspects may also be implemented in combination with other program modules.
- Generally, program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that embodiments may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and comparable computing devices. Embodiments may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
- Embodiments may be implemented as a computer-implemented process (method), a computing system, or as an article of manufacture, such as a computer program product or computer readable media. The computer program product may be a computer storage medium readable by a computer system and encoding a computer program that comprises instructions for causing a computer or computing system to perform example process(es). The computer-readable storage medium can for example be implemented via one or more of a volatile computer memory, a non-volatile memory, a hard drive, a flash drive, a floppy disk, or a compact disk, and comparable media.
- Throughout this specification, the term “platform” may be a combination of software and hardware components for separating portions of document contents into individually controlled sections on a user interface. Examples of platforms include, but are not limited to, a hosted service executed over a plurality of servers, an application executed on a single computing device, and comparable systems. The term “server” generally refers to a computing device executing one or more software programs typically in a networked environment. However, a server may also be implemented as a virtual server (software programs) executed on one or more computing devices viewed as a server on the network. More detail on these technologies and example operations is provided below.
- According to some embodiments, a gesture enabled input device and display screen may be utilized for viewing documents and receiving input from a user over a user interface. The gesture enabled input device and display screen may utilize any technology that allows touch input by a user to be recognized. For example, some technologies may include, but are not limited to: heat, finger pressure, high capture rate cameras, infrared light, optic capture, tuned electromagnetic induction, ultrasonic receivers, transducer microphones, laser rangefinders, shadow capture, and the like. The user interface of a gesture enabled device may display content and documents associated with word processing applications, presentation applications, spreadsheet applications and web page content, and menus of actions for interacting with the displayed content. A user may use touch input actions to interact with the user interface to access, view and edit the content, such as documents, tables, spreadsheets, charts, lists, and any content (e.g., audio, video, etc.). The gesture enabled input device may make use of features specific to touch or gesture enabled computing devices, but may also work with a traditional mouse and keyboard. A touch input action, such as a tap or swipe action as used herein may be provided by a user through a finger, a pen, a mouse, or similar device, as well as through predefined keyboard entry combinations, eye-tracking, and/or a voice command.
- FIG. 1 illustrates an example networked environment where embodiments may be implemented. In an example networked environment, a user may utilize one or more devices, such as mobile devices 108, smart phones 104, 106, tablets 102, and other personal computing devices 114 to view documents, applications and web pages over the network. The devices such as smart phones 106 and mobile devices 108 may have small memory capacity and low processing or CPU speed compared with larger computing devices, and may operate in conjunction with a remote server over a cloud 110 for retrieving the data associated with a document viewed on the device. For example, a remote server 112 residing in the cloud 110 may store, maintain, and manage application data for applications and documents executed on the client device, and through a browser executed on the client device, the client device may connect to the cloud for retrieving requested application data and loading the data onto the client device. The client device may connect to the server 112 on the cloud 110 via a networked environment, which may be a wireless or wired network as some examples, and likewise the application data may be delivered from the server 112 residing on the cloud 110 to the client device via the networked environment.
- When a user opens an application, document, and/or web application on the client device, such as a
mobile device 108, tablet 102, and/or smartphone, data associated with the application, document, and/or web application may be retrieved from the server 112 residing in the cloud 110. The retrieved data may be formatted for display on the client device in a format optimized for the size of the user interface of the client device. Often times the client device, including a mobile device 108, tablet 102, and/or smartphone, may have a limited size user interface, such that all of the contents of a document may not be optimally displayed at the same time. - In an example embodiment, on a gesture enabled device, a user may use a touch action or an optically captured gesture such as a tap, drag, and/or swipe to scroll through the document and navigate to subsequent pages and portions of the document. The user may also use pinching and expanding touch actions to zoom in and out of the document and resize the document as it is displayed on the user interface. Additionally, a document may include additional elements and content portions along with textual content, such as tables, graphics, slides, and embedded audio/visual files. In an example scenario, the user may resize a portion of the document, such as a table, in order to optimally view the selected portion, and the resizing of the portion may cause the entire document to be resized, resulting in certain portions of the document extending outside of the viewing window of the user interface and becoming unreadable.
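- As a hedged sketch of the retrieval described above (not the claimed implementation), a client could request document data sized for its viewport from a server in the cloud; the endpoint path, query parameters, and response shape below are hypothetical.

```typescript
// Illustrative sketch: retrieve document data sized for the client viewport.
// The endpoint "/api/documents/:id" and the DocumentData shape are hypothetical.
interface DocumentData {
  textHtml: string;          // textual content of the document
  embeddedParts: string[];   // tables, images, slides, etc., as HTML fragments
}

async function fetchDocumentForViewport(docId: string): Promise<DocumentData> {
  const params = new URLSearchParams({
    width: String(window.innerWidth),
    height: String(window.innerHeight),
  });
  const response = await fetch(`/api/documents/${docId}?${params}`);
  if (!response.ok) {
    throw new Error(`Failed to retrieve document ${docId}: ${response.status}`);
  }
  return response.json() as Promise<DocumentData>;
}
```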
- Often times, the optimal size of the textual content for viewing on the user interface of the client device may not be an optimal size for viewing the additional elements, such as embedded tables, graphics, slides, and audio/visual files. For example, when the textual content is magnified to an optimal viewing size, a table included in the document may be magnified as well, such that portions of the table extend outside of the user interface and are not viewable on the user interface. If the user resizes the table to make the entire table visible on the user interface, then the textual content of the document may be resized to an unreadable size. In another example, when a user views a slide show presentation on the client device, the user may optimize the size of the slide for viewing on the user interface, causing textual content associated with the slide, such as accompanying notes, not to be optimally visible on the user interface. The user may have to continually resize portions of the document, and pan and scroll through the document in order to effectively view all of the contents of the document.
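- As a rough illustration with hypothetical numbers (not taken from the disclosure): if the textual content is legible at a given zoom level but an embedded six-column table laid out at that zoom is about 800 px wide, a 360 px-wide screen shows less than half of the table; shrinking the whole document by 360/800 = 0.45 to fit the table would render the text at 45% of its legible size. Treating the table and the surrounding text as separately sized, separately scrollable sections avoids forcing this trade-off.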
-
FIG. 2 illustrates an example user interface for optimally displaying textual and other content on a gesture enabled device, according to embodiments. A document may be viewed on the user interface 202 of a client device such as a mobile device, smart phone and/or tablet. The document may contain textual content 208 as well as other embedded content such as tables, graphics, slides, and audio/visual files. Often times when the document is viewed on the device, the document size and format may be optimized for viewing within the size constraints of the user interface 202 of the device. As demonstrated in diagram 200, when the size of the document is resized to optimize the display of the textual content 208 on the user interface 202, a table 204 included within the document may be larger than the size of the user interface 202 and may extend outside of the visible size constraints of the user interface 202. - A system according to embodiments may enable the
textual content 208 to be separated from other content portions included within the document, such as the table 204, such that each portion of the document as viewed on the user interface 202 may be separately controlled by a user. For example, the user may be able to navigate through the textual content 208 without affecting the table 204, and may additionally be able to navigate and scroll through the table 204 without affecting the textual content 208. When the document is viewed on the user interface with a portion of the table 204 extending outside of the size constraints of the user interface 202, an indicator 212 may be provided for indicating to the user that additional portions of content may be available in the direction of the indicator 212 and prompting the user to scroll through the table 204 to view the additional content. - In an example embodiment, the
user interface 202 may be gesture enabled, such that the user may use a touch action 206 to navigate through portions of the document viewed on the user interface 202 of the client device. When the document is viewed on the user interface 202 with a portion of the table 204 overflowing or extending outside of the size constraints of the user interface 202, the user may use a touch action 206 such as a swipe or drag in the direction of the indicator 212 to navigate to the portions of the table 204 not currently visible. For example, the user may swipe from left to right to display the overflowing table content. Additionally, the user may use a touch action 206, such as a tap on the indicator 212, to navigate to additional portions of the document not currently visible. - As demonstrated in diagram 200, the user may realize that as the document is currently viewed on the
user interface 202, a portion of the table 204 may extend outside of the size constraints of the user interface 202. The user may execute a touch action 206, such as a horizontal swipe in the direction of the indicator 212, to scroll through the table 204 in order to display the remaining portions of the table 204 on the user interface. When the user executes the swipe action on the table 204, the user interface 210 may scroll through the table 204 and may display portions 214 of the table that were not displayed in the original user interface 202. While the user scrolls through the table 204, the textual content 208 that is not a part of the table 204 may remain fixed within the user interface 202, such that the size and position of the textual content 208 may remain unchanged. Likewise, the user may independently navigate and control textual content 208 using touch actions on the user interface, and while the textual content 208 may be scrolled and resized, the table 204 portion of the document may remain fixed within the user interface 202, such that the size and position of the table 204 may remain unchanged. - In some examples, a single indicator may be used, left or right depending on a language of the system, without prompting the user to navigate in a particular direction. In additional embodiments, when the user scrolls through the table 204 to display
portions 214 of the table that were not originally visible, an additional indicator 216 may be provided in order to indicate that there may be additional portions of the table in another direction which may not be visible on the user interface 212. For example the additional indicator 216 may indicate that there are portions of the table 204 available to the left of the table as currently viewed, while the indicator 212 on the right may indicate additional portions available to the right. In another embodiment, if portions of the table are available above and/or below the currently viewed table, then indicators may be provided above and below the currently viewed table prompting the user to scroll up and down with vertical touch actions, as well as horizontally to the left and to the right. - In a further embodiment, the user may be able to add formatting to each content portion individually without affecting the formatting of the other content portions of the document. For example, the user may format the table to be right aligned while the other content is left aligned. The user may also be enabled to swipe from right to left (or left to right) in order to read the portions of the table that extend outside of the user interface without affecting the format of the textual content above and below the table.
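- One possible realization of the behavior described for FIG. 2, sketched below under the assumption of an HTML rendering of the document: the embedded table is wrapped in its own horizontally scrollable container so that the surrounding text keeps its size and position, and overflow indicators are toggled from the container's scroll state. The element structure and indicator characters are illustrative assumptions.

```typescript
// Wrap a table in an independently scrollable region and toggle overflow
// indicators based on its scroll position. Structure and names are hypothetical.
function makeTableIndependentlyScrollable(table: HTMLTableElement): void {
  const wrapper = document.createElement("div");
  wrapper.style.overflowX = "auto";   // the table scrolls on its own
  wrapper.style.maxWidth = "100%";    // surrounding text layout is unaffected
  table.replaceWith(wrapper);
  wrapper.appendChild(table);

  const leftHint = document.createElement("span");
  const rightHint = document.createElement("span");
  leftHint.textContent = "◀";
  rightHint.textContent = "▶";
  wrapper.before(leftHint);
  wrapper.after(rightHint);

  const updateHints = () => {
    // Show an indicator only when more table content exists in that direction.
    leftHint.style.visibility = wrapper.scrollLeft > 0 ? "visible" : "hidden";
    const more = wrapper.scrollLeft + wrapper.clientWidth < wrapper.scrollWidth - 1;
    rightHint.style.visibility = more ? "visible" : "hidden";
  };

  wrapper.addEventListener("scroll", updateHints);
  updateHints();
}
```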
-
FIG. 3 illustrates an example user interface for individually resizing portions of content on a gesture enabled device, according to embodiments. As previously discussed in conjunction with FIG. 2, a system according to embodiments may enable different content portions of a document viewed on a user interface 302 of a client device to be separately controlled on the user interface 302. For example, on a gesture enabled device, a user may use touch actions to navigate through textual content portions 308 of the document without affecting the size and position of other portions of the document. Additionally, the user may scroll, pan, resize, and reformat a table 310 embedded within the document without affecting the size, position, and formatting of the textual content 308 of the document. - In an example embodiment, on a gesture enabled device, if the table 310 is originally displayed such that portions of the table may overflow or extend outside the viewing area of the
user interface 302 displaying the document, the user may resize the table 310 in order to view the entire table 310 within the user interface 302. The user may use expanding and pinching touch actions 314 to zoom in and out of the table to magnify and reduce the size of the table 310 as it is displayed on the user interface. As the user magnifies or resizes the table 310 to optimally view the table 310 within the user interface 302 of the client device, the textual content portions 308 may remain separate such that the resizing action on the table 310 does not operate to resize the textual content portions 308. The textual content portions 308 that are not a part of the table 310 may remain fixed within the user interface 302, such that the size and position of the textual content portions 308 may remain unchanged. -
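- A hedged sketch of the pinch-resize behavior described for FIG. 3, assuming an HTML rendering: a scale transform is applied only to the element holding the table, so the textual content portions keep their layout. The pointer bookkeeping and function name are assumptions.

```typescript
// Sketch: apply pinch-zoom scaling to the table's wrapper only, leaving the
// surrounding text untouched. Names and thresholds are hypothetical.
function enablePinchResize(target: HTMLElement): void {
  const pointers = new Map<number, PointerEvent>();
  let startDistance = 0;
  let scale = 1;

  const distance = () => {
    const [a, b] = [...pointers.values()];
    return Math.hypot(a.clientX - b.clientX, a.clientY - b.clientY);
  };

  target.addEventListener("pointerdown", (e) => {
    pointers.set(e.pointerId, e);
    if (pointers.size === 2) startDistance = distance();
  });

  target.addEventListener("pointermove", (e) => {
    if (!pointers.has(e.pointerId)) return;
    pointers.set(e.pointerId, e);
    if (pointers.size === 2 && startDistance > 0) {
      // Scale only this element; other document portions keep their layout.
      const factor = distance() / startDistance;
      target.style.transform = `scale(${scale * factor})`;
      target.style.transformOrigin = "top left";
    }
  });

  target.addEventListener("pointerup", (e) => {
    if (pointers.size === 2 && startDistance > 0) {
      scale *= distance() / startDistance;
    }
    pointers.delete(e.pointerId);
  });
}
```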
FIG. 4 illustrates an example slide show presentation with accompanying textual content viewed on a user interface, according to embodiments. As previously described, a user interface 402 on a client device may enable a user to view a document, such as a slide show presentation, on the client device. Due to the smaller size of some client device user interfaces, all of the contents of the document may not be displayed simultaneously, and the user may need to navigate through and resize portions of the document in order to optimally read and view the document. - In a system according to embodiments, when a slide show presentation is viewed on the client device, a
slide image 404 may be displayed on the user interface 402, and the device may also request the data for the textual content 408 of the slide for displaying along with the slide image 404 on the user interface 402. The slide image 404 and the textual content 408 may be separated such that they may each be independently controlled and navigated by the user on the user interface. In an example embodiment, the slide image 404 and the textual content 408 may be displayed as a split screen on the user interface 402 such that the slide image split screen 412 and the textual content split screen 414 may be independent from each other and separately controlled by a user. The split screen may be positioned side by side and/or top and bottom depending on an orientation and size of the user interface 402 of the client device. - In an example embodiment, the user may use touch actions such as a swipe, tap or drag on each split screen to control the content contained within the split screen. For example, the user may zoom in/out, resize, and pan the
slide image 404 within the slide image split screen 412 without affecting the size and position of the textual content 408 within the textual content split screen 414. Additionally, the user interface may be configured to enable the user to swipe in a horizontal direction on the slide image split screen 412 for navigating to a new slide of the slide show presentation. When the user navigates to a new slide of the slide show presentation, the new slide image may be displayed on the slide image split screen 412, and the client device may retrieve the accompanying textual content for the new slide and may update the textual content split screen 414 to display the accompanying textual content for the new slide. - In an additional embodiment, when the
slide image 404 is viewed in the slide image split screen 412 and the textual content 408 is displayed in the textual content split screen 414, all of the textual content 408 may not be viewable within the textual content split screen 414. The user may perform a touch action 406 to scroll through the textual content 408 and to navigate to the portions of the textual content 408 which extend outside of the viewable area of the textual content split screen 414. In an example scenario, the textual content 408 may overflow over the edge of the textual content split screen 414, indicating to the user that there is additional textual content 418 and prompting the user to swipe further down to read through the textual content. The user may perform a swipe action on the textual content split screen 414 in a vertical direction to display the overflowing additional textual content 418 within the textual content split screen 414. The user may also use a drag action to pan the textual content 408 in any direction in which the text may overflow. The slide image 404 may remain fixed in the slide image split screen 412 while the user scrolls through the accompanying text within the textual content split screen 414. Additionally, the user interface 402 may provide an indicator for prompting the user to navigate to the additional textual content 418. -
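- The split-screen behavior described for FIG. 4 might be wired up as sketched below: a horizontal swipe on the slide pane advances the slide and refreshes the notes pane, while the notes pane scrolls vertically on its own. The fetchSlideNotes() endpoint, swipe threshold, and element roles are hypothetical assumptions.

```typescript
// Sketch of a split-screen slide viewer: the slide pane and the notes pane are
// controlled independently. Endpoint and names are hypothetical.
async function fetchSlideNotes(slideIndex: number): Promise<string> {
  const response = await fetch(`/api/slides/${slideIndex}/notes`); // hypothetical endpoint
  return response.text();
}

function wireSlideSplitScreen(
  slidePane: HTMLElement,
  notesPane: HTMLElement,
  slides: string[],            // image URLs for the slides
): void {
  let current = 0;
  let startX = 0;

  const showSlide = async (index: number) => {
    current = Math.max(0, Math.min(slides.length - 1, index));
    slidePane.innerHTML = `<img src="${slides[current]}" alt="slide ${current + 1}">`;
    notesPane.textContent = await fetchSlideNotes(current);
    notesPane.scrollTop = 0;   // notes start at the top for the new slide
  };

  slidePane.addEventListener("pointerdown", (e) => { startX = e.clientX; });
  slidePane.addEventListener("pointerup", (e) => {
    const dx = e.clientX - startX;
    if (Math.abs(dx) > 40) {
      // Horizontal swipe detected on the slide pane only; the notes pane keeps
      // its size and position, and only its content is replaced for the new slide.
      void showSlide(dx < 0 ? current + 1 : current - 1);
    }
  });

  notesPane.style.overflowY = "auto"; // vertical scrolling confined to the notes pane
  void showSlide(0);
}
```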
FIG. 5 illustrates an example split screen displaying a slide image and accompanying notes on a user interface, according to embodiments. In an example embodiment, a slide show presentation may be displayed on a user interface of a client device, and a slide image 504 may be displayed on the user interface 502 as well as textual content and additional notes 508 which may accompany the slide image 504. The slide show presentation may be displayed as a split screen on the user interface 502 such that the slide image split screen 512 and the textual content split screen 514 may be independent from each other and separately controlled by a user. - In an example embodiment, the user may perform a
touch action 506, such as a swipe, in a horizontal direction on the slide image split screen 512 for navigating to a new slide within the slide show presentation. Additionally the user may perform pinching, expanding and panning actions to resize and reposition the slide image 504 within the slide image split screen 512. While the user may resize and reposition the slide image 504 within the slide image split screen 512, the size and position of the textual content and the notes 508 included within the textual content split screen 514 may remain unchanged.
slide image 504 is viewed in the slide image split screen 512, the textual content that is contained within the slide image 504 may be initially displayed in the textual content split screen 514. As previously discussed, the user may swipe the textual content split screen 514 in a vertical direction to display overflowing textual content. - In a further embodiment, if
notes 508 accompany the slide image 504, the user may perform a touch action, such as a swipe, in a horizontal direction on the textual content split screen 514 to cause the notes 508 to be displayed. All of the content of the notes 508 may not be viewable within the textual content split screen 514, and the user may also perform a touch action 506 within the textual content split screen 514 to navigate to the portions of the notes 508 which extend outside of the viewable area of the textual content split screen 514. In an example scenario, the text of the notes 508 may overflow over the edge of the textual content split screen 514, indicating to the user that there is additional textual content 518 and prompting the user to swipe further down to read through the textual content. The user may perform a swipe action in a vertical direction to display the overflowing notes 508 within the textual content split screen 514, and may additionally perform a drag action to pan the notes 508 in any direction. Additionally, the user may perform pinching and expanding actions to resize the textual content and notes included within the textual content split screen 514 to optimize the display of the textual content and notes. While the user navigates through the textual content and the notes 508 included within the textual content split screen 514, the slide number and the size and position of the slide image may remain unchanged. - The example user interface elements and interactions discussed in
FIG. 1 through FIG. 5 are for illustration purposes only and do not constitute a limitation on embodiments. A system for separating portions of document contents into individually controlled sections on a user interface may be implemented with other user interfaces, interface elements, presentations, and configurations using the principles described herein. -
FIG. 6 is an example networked environment, where embodiments may be implemented. A system for separating portions of document contents into individually controlled sections on a user interface of a client device may be implemented via software executed over one or more servers 614 such as a hosted service. The platform may communicate with client applications on individual computing devices such as a smart phone 613, a laptop computer 612, or desktop computer 611 (‘client devices’) through network(s) 610. - Client applications executed on any of the client devices 611-613 may facilitate communications via application(s) executed by
servers 614, or on an individual server 614. An application executed on one of the servers may facilitate separating portions of document contents into individually controlled sections on a user interface of a client device. The application may retrieve relevant data from data store(s) 615 directly or through database server 618, and provide requested services (e.g. document editing) to the user(s) through client devices 611-613. - Network(s) 610 may comprise any topology of servers, clients, Internet service providers, and communication media. A system according to embodiments may have a static or dynamic topology. Network(s) 610 may include secure networks such as an enterprise network, an unsecure network such as a wireless open network, or the Internet. Network(s) 610 may also coordinate communication over other networks such as the Public Switched Telephone Network (PSTN) or cellular networks. Furthermore, network(s) 610 may include short range wireless networks such as Bluetooth or similar ones. Network(s) 610 provide communication between the nodes described herein. By way of example, and not limitation, network(s) 610 may include wireless media such as acoustic, RF, infrared and other wireless media.
- Many other configurations of computing devices, applications, data sources, and data distribution systems may be employed to implement a platform for separating portions of document contents into individually controlled sections on a user interface of a client device. Furthermore, the networked environments discussed in
FIG. 6 are for illustration purposes only. Embodiments are not limited to the example applications, modules, or processes. -
FIG. 7 and the associated discussion are intended to provide a brief, general description of a suitable computing environment in which embodiments may be implemented. With reference to FIG. 7, a block diagram of an example computing operating environment for an application according to embodiments is illustrated, such as computing device 700. In a basic configuration, computing device 700 may be any computing device executing an application for separating portions of document contents into individually controlled sections on a user interface of a client device according to embodiments, and may include at least one processing unit 702 and system memory 704. Computing device 700 may also include a plurality of processing units that cooperate in executing programs. Depending on the exact configuration and type of computing device, the system memory 704 may be volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.), or some combination of the two. System memory 704 typically includes an operating system 705 suitable for controlling the operation of the platform, such as the WINDOWS® operating systems from MICROSOFT CORPORATION of Redmond, Wash. The system memory 704 may also include one or more software applications such as an application 724 and gesture detection module 726. - The
application 724 may facilitate recognizing separate content portions of a document that is viewed on a gesture enabled client device such as a smart phone, mobile device, and/or tablet and separating the content portions into individually controlled portions on the client device. The application 724 may enable a computing device 700 to detect a document that is viewed on the user interface of a client device such as a smart phone, and to identify different content portions of the document, such as textual content, tables, graphics, slide images, and audio/visual files. The application 724 may facilitate separating each content portion so that each content portion can be separately and individually controlled by a user without affecting the other content portions. Through the gesture detection module 726, application 724 may detect a touch action on a select content portion of the document on the user interface of the client device. The gesture detection module 726 may enable the user to resize, reposition, reformat and scroll through the selected content portion without affecting the size and position of the rest of the document. The application 724 and gesture detection module 726 may be separate applications or integrated modules of a hosted service. This basic configuration is illustrated in FIG. 7 by those components within dashed line 708. -
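- A minimal sketch of the routing attributed to the gesture detection module, assuming content portions are marked with a hypothetical data-portion attribute in an HTML rendering: a touch action is dispatched only to the portion under the touch point, so the remaining portions keep their size and position.

```typescript
// Route a touch action to the content portion under the touch point so that
// only that portion is scrolled. The "data-portion" attribute is hypothetical.
type PortionKind = "text" | "table" | "slide" | "image";

function portionAt(x: number, y: number): HTMLElement | null {
  // Walk up from the touched element to the enclosing content portion, if any.
  const hit = document.elementFromPoint(x, y);
  return hit ? hit.closest<HTMLElement>("[data-portion]") : null;
}

function routeTouchAction(e: PointerEvent, dx: number, dy: number): void {
  const portion = portionAt(e.clientX, e.clientY);
  if (!portion) return;

  const kind = portion.dataset.portion as PortionKind;
  if (kind === "table") {
    portion.scrollLeft -= dx;   // horizontal scrolling confined to the table
  } else {
    portion.scrollTop -= dy;    // other portions scroll vertically
  }
  // Portions not under the touch point are never modified here, so their size
  // and position remain unchanged.
}
```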
Computing device 700 may have additional features or functionality. For example, the computing device 700 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 7 by removable storage 709 and non-removable storage 710. Computer readable storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. System memory 704, removable storage 709 and non-removable storage 710 are all examples of computer readable storage media. Computer readable storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing device 700. Any such computer readable storage media may be part of computing device 700. Computing device 700 may also have input device(s) 712 such as keyboard, mouse, pen, voice input device, touch input device, and comparable input devices. Output device(s) 714 such as a display, speakers, printer, and other types of output devices may also be included. These devices are well known in the art and need not be discussed at length here. -
Computing device 700 may also contain communication connections 716 that allow the device to communicate with other devices 718, such as over a wired or wireless network in a distributed computing environment, a satellite link, a cellular link, a short range network, and comparable mechanisms. Other devices 718 may include computer device(s) that execute communication applications, web servers, and comparable devices. Communication connection(s) 716 is one example of communication media. Communication media can include therein computer readable instructions, data structures, program modules, or other data. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. - Example embodiments also include methods. These methods can be implemented in any number of ways, including the structures described in this document. One such way is by machine operations, of devices of the type described in this document.
- Another optional way is for one or more of the individual operations of the methods to be performed in conjunction with one or more human operators performing some of the operations. These human operators need not be collocated with each other, but each can be only with a machine that performs a portion of the program.
-
FIG. 8 illustrates a logic flow diagram for a process of separating portions of document contents into individually controlled sections on a user interface of a client device, according to embodiments. Process 800 may be implemented on a computing device or similar electronic device capable of executing instructions through a processor. -
Process 800 begins with operation 810, where the system may detect a document viewed on the user interface of the client device. In an example embodiment, the client device may be a mobile device, smart phone, and/or tablet having a restricted size user interface. At operation 820, the system may identify different individual content portions of the document viewed on the client device. The individual content portions may include textual content, tables, graphics, slides, and embedded audio/visual files. At operation 830, the system may separate each of the individual content portions into individually controlled sections. The system may separate the sections via coding behind the document, and additionally the system may separate the sections visually by providing a split screen, which may be side by side and/or top and bottom. Each section of content may be independent from the other content portions such that they may be navigated and controlled by a user without affecting other content portions. - At
operation 840, the system may detect a touch action (or comparable input such as optically captured gesture, pen input, voice input, eye-tracking, etc.) on a selected content portion. A touch action may include a tap, swipe, pinch, expand, and/or drag on the content included within the selected content portion in order to scroll, resize, and reposition the selected content portion. At operation 850, the system may enable the user to control the selected content portion without affecting the remaining content portions of the document. While the user uses touch actions to scroll, resize, and reposition the selected content portion, the remaining content portions may remain fixed within the user interface of the client device, such that the size and position of the remaining content portions do not change and are not affected by the user's actions on the selected content portion.
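- A compact outline tying the operations of process 800 together; identifyPortions, separateIntoSection, and applyActionToPortion are hypothetical stand-ins supplied by the caller, not functions defined by the disclosure.

```typescript
// Illustrative outline of process 800. The three callbacks are hypothetical
// stand-ins for the steps described above and are supplied by the caller.
interface ContentPortion {
  element: HTMLElement;
  kind: "text" | "table" | "slide" | "image";
}

function runDocumentSeparation(
  documentRoot: HTMLElement,                                         // operation 810: the detected document
  identifyPortions: (root: HTMLElement) => ContentPortion[],         // operation 820
  separateIntoSection: (portion: ContentPortion) => void,            // operation 830
  applyActionToPortion: (portion: ContentPortion, e: PointerEvent) => void, // operation 850
): void {
  const portions = identifyPortions(documentRoot);
  portions.forEach(separateIntoSection);

  // Operations 840-850: detect a touch action on a selected portion and control
  // only that portion; portions not under the touch keep their size and position.
  documentRoot.addEventListener("pointerdown", (e) => {
    const selected = portions.find((p) => p.element.contains(e.target as Node));
    if (selected) {
      applyActionToPortion(selected, e);
    }
  });
}
```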
- The operations included in process 800 are for illustration purposes. Separating portions of document contents into individually controlled sections on a user interface of a client device may be implemented by similar processes with fewer or additional steps, as well as in different order of operations using the principles described herein. - The above specification, examples and data provide a complete description of the manufacture and use of the composition of the embodiments. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims and embodiments.
Claims (21)
1. A method executed at least in part in a computing device for separating portions of document contents into individually controlled sections on a user interface of a client device, the method comprising:
displaying a document on the user interface of a client device;
identifying one or more different content portions of the document;
separating the different content portions into individually controlled sections; and
enabling distinct navigation actions on the individually controlled sections.
2. The method of claim 1 , wherein identifying one or more different content portions of the document further comprises:
identifying one or more of: textual content, a table, an embedded video, an image, and a graphic.
3. The method of claim 1 , wherein the client device is a gesture enabled client device accepting one of a touch action and an optically captured gesture action.
4. The method of claim 3 , further comprising:
if the document is a presentation document, displaying a slide image in a first portion of a split screen; and
displaying textual content of the slide in a second portion of the split screen such that the textual content is readily legible.
5. The method of claim 4 , further comprising:
displaying textual content accompanying the slide in the second portion in addition or in place of the textual content of the slide.
6. The method of claim 4 , further comprising:
enabling the user to perform a horizontal swipe action on the first portion of the split screen;
upon detection of the horizontal swipe action, scrolling to a new slide in a direction of the horizontal swipe action; and
modifying the textual content in the second portion of the split screen to match the new slide.
7. The method of claim 4 , further comprising:
enabling the user to perform a vertical swipe action on the second portion of the split screen; and
upon detection of the vertical swipe action, scrolling the textual content in a direction of the vertical swipe action.
8. The method of claim 4 , further comprising:
detecting the touch action on one of the first and second portions of the split screen; and
enabling the user to perform at least one from a set of: scroll, resize, and reposition content on one of the first and second portions of the split screen.
9. The method of claim 3 , further comprising:
if the document includes an embedded table, displaying a portion of the table to fit a width of a client device display such that contents of the table are readily legible; and
resizing remaining content of the document displayed above and below the table to fit the width of the client device display.
10. The method of claim 9 , further comprising:
upon detection of a horizontal swipe action on the displayed table, scrolling the table in a direction of the horizontal swipe action; and
preserving the remaining content of the document displayed above and below the table such that a reading experience is maintained.
11. The method of claim 9 , further comprising:
aligning the displayed table and the remaining content of the document displayed above and below the table one of left-to-right and right-to-left based on a language of the document.
11. (canceled)
12. The method of claim 2 , further comprising:
automatically selecting sizes of a first and a second portion of a client device display for displaying the different content portions; and
enabling a user to modify the sizes of the first and second portions.
13. The method of claim 1 , wherein the client device is one of: a smart phone, a tablet, a handheld computer, and a vehicle mount computer.
14. A client device for separating portions of document contents into individually controlled sections on a display of the client device, the client device comprising:
a memory storing instructions;
a processor coupled to the memory, the processor executing an application displaying content in conjunction with the stored instructions, wherein the application is configured to:
display a document on a user interface of a client device;
identify one or more different content portions of the document comprising a textual content, a table, an embedded video, an image, and a graphic;
separate the different content portions into individually controlled sections;
display the different content portions on distinct portions of the display; and
enable distinct navigation actions on the distinct portions of the display.
15. The client device of claim 14 , wherein the application is further configured to:
if the document is a presentation document,
display a slide image in a first portion of a split screen, and
display one or more of textual content of the slide and textual content accompanying the slide in a second portion of the split screen such that the textual content is readily legible; and
if the document includes an embedded table,
display a portion of the table to fit a width of a client device display such that contents of the table are readily legible, and
resize remaining content of the document displayed above and below the table to fit the width of the client device display.
16. The client device of claim 15 , wherein the application is further configured to:
upon detection of a horizontal swipe action on the first portion, scroll to a new slide in a direction of the horizontal swipe action and modify the textual content in the second portion of the split screen to match the new slide;
upon detection of a vertical swipe action on the second portion, scroll the textual content in a direction of the vertical swipe action; and
upon detection of a horizontal swipe action on the displayed table, scroll the table in a direction of the horizontal swipe action and preserve the remaining content of the document displayed above and below the table such that a reading experience is maintained.
17. The client device of claim 15 , wherein the application is further configured to:
identify when additional portions of a displayed content portion extend outside of a viewing window of the user interface; and
provide an indicator for alerting the user that additional portions of the displayed content portion are available in a direction of the indicator.
18. The client device of claim 17, wherein the table is maintained as a separate entity from remaining content of the displayed document to enable distinct navigation actions on the table while a display of the remaining content is preserved.
19. A computer-readable memory device with instructions stored thereon for separating portions of document contents into individually controlled sections on a user interface of a client device, the instructions comprising:
displaying a document on a user interface of a client device;
if the document is a presentation document,
displaying a slide image in a first portion of a split screen,
displaying one or more of textual content of the slide and textual content accompanying the slide in a second portion of the split screen such that the textual content is readily legible,
upon detection of a horizontal swipe action on the first portion, scrolling to a new slide in a direction of the horizontal swipe action and modifying the textual content in the second portion of the split screen to match the new slide, and
upon detection of a vertical swipe action on the second portion, scrolling the textual content in a direction of the vertical swipe action; and
if the document includes an embedded table,
displaying a portion of the table to fit a width of a client device display such that contents of the table are readily legible,
resizing remaining content of the document displayed above and below the table to fit the width of the client device display,
upon detection of a horizontal swipe action on the displayed table, scrolling the table in a direction of the horizontal swipe action and preserving the remaining content of the document displayed above and below the table such that a reading experience is maintained.
20. The computer-readable memory device of claim 19 , wherein the instructions further comprise:
enabling one or more of resizing, panning, expanding, and shrinking of the table without modifying a size and position of the remaining content above and below the table.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/524,175 US20130339830A1 (en) | 2012-06-15 | 2012-06-15 | Optimized document views for mobile device interfaces |
US14/833,083 US10867117B2 (en) | 2012-06-15 | 2015-08-22 | Optimized document views for mobile device interfaces |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/524,175 US20130339830A1 (en) | 2012-06-15 | 2012-06-15 | Optimized document views for mobile device interfaces |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/833,083 Continuation US10867117B2 (en) | 2012-06-15 | 2015-08-22 | Optimized document views for mobile device interfaces |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130339830A1 true US20130339830A1 (en) | 2013-12-19 |
Family
ID=49757131
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/524,175 Abandoned US20130339830A1 (en) | 2012-06-15 | 2012-06-15 | Optimized document views for mobile device interfaces |
US14/833,083 Active 2033-01-12 US10867117B2 (en) | 2012-06-15 | 2015-08-22 | Optimized document views for mobile device interfaces |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/833,083 Active 2033-01-12 US10867117B2 (en) | 2012-06-15 | 2015-08-22 | Optimized document views for mobile device interfaces |
Country Status (1)
Country | Link |
---|---|
US (2) | US20130339830A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111290752B (en) * | 2019-12-24 | 2024-02-20 | 明度智云(浙江)科技有限公司 | Frame processing method and device for web form |
CN111797603B (en) * | 2020-07-02 | 2022-02-01 | 北京字节跳动网络技术有限公司 | Method and device for browsing table in document, electronic equipment and storage medium |
CN113238706B (en) * | 2021-05-10 | 2023-06-20 | 北京字跳网络技术有限公司 | View display method, device, equipment and medium |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5812131A (en) | 1997-03-07 | 1998-09-22 | International Business Machines Corp. | Mobile client computer programmed to process table displays |
US7365758B2 (en) | 2002-10-21 | 2008-04-29 | Microsoft Corporation | System and method for scaling data according to an optimal width for display on a mobile device |
BRPI0414379A (en) * | 2003-09-24 | 2006-11-21 | Nokia Corp | method for presenting at least a portion of an object, computing program, computing program product, and, device and system for presenting at least a portion of an object |
US8881052B2 (en) * | 2007-03-21 | 2014-11-04 | Yahoo! Inc. | Methods and systems for managing widgets through a widget dock user interface |
US20090271283A1 (en) * | 2008-02-13 | 2009-10-29 | Catholic Content, Llc | Network Media Distribution |
US20090313574A1 (en) | 2008-06-16 | 2009-12-17 | Microsoft Corporation | Mobile document viewer |
US20100169362A1 (en) * | 2008-06-27 | 2010-07-01 | Visisoft, Llc | Palette for accessing document history |
US8205168B1 (en) * | 2008-12-01 | 2012-06-19 | Adobe Systems Incorporated | Methods and systems for page navigation of dynamically laid-out systems |
US8441441B2 (en) | 2009-01-06 | 2013-05-14 | Qualcomm Incorporated | User interface for mobile devices |
TWI447641B (en) | 2009-03-31 | 2014-08-01 | Ibm | Method and computer program product for displaying document on mobile device |
US8669945B2 (en) * | 2009-05-07 | 2014-03-11 | Microsoft Corporation | Changing of list views on mobile device |
US9600919B1 (en) * | 2009-10-20 | 2017-03-21 | Yahoo! Inc. | Systems and methods for assembling and/or displaying multimedia objects, modules or presentations |
GB2524419B (en) | 2009-10-23 | 2015-10-28 | Flexenable Ltd | Electronic document reading devices |
US8432368B2 (en) | 2010-01-06 | 2013-04-30 | Qualcomm Incorporated | User interface methods and systems for providing force-sensitive input |
US20110202829A1 (en) * | 2010-02-12 | 2011-08-18 | Research In Motion Limited | Method, device and system for controlling a display according to a defined sizing parameter |
US20110258535A1 (en) * | 2010-04-20 | 2011-10-20 | Scribd, Inc. | Integrated document viewer with automatic sharing of reading-related activities across external social networks |
US9311426B2 (en) * | 2011-08-04 | 2016-04-12 | Blackberry Limited | Orientation-dependent processing of input files by an electronic device |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5893125A (en) * | 1995-01-27 | 1999-04-06 | Borland International, Inc. | Non-modal database system with methods for incremental maintenance |
US20100268773A1 (en) * | 2000-04-26 | 2010-10-21 | Novarra, Inc. | System and Method for Displaying Information Content with Selective Horizontal Scrolling |
US20110231782A1 (en) * | 2000-06-12 | 2011-09-22 | Softview L.L.C. | Scalable Display of Internet Content on Mobile Devices |
US20060288280A1 (en) * | 2005-05-11 | 2006-12-21 | Nokia Corporation | User-defined changing of page representations |
US20090060452A1 (en) * | 2007-09-04 | 2009-03-05 | Apple Inc. | Display of Video Subtitles |
US20110307772A1 (en) * | 2010-04-12 | 2011-12-15 | Google Inc. | Scrolling in Large Hosted Data Set |
US20110265002A1 (en) * | 2010-04-21 | 2011-10-27 | Research In Motion Limited | Method of interacting with a scrollable area on a portable electronic device |
Cited By (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140089832A1 (en) * | 2012-09-25 | 2014-03-27 | Samsung Electronics Co., Ltd. | Apparatus and method for switching split view in portable terminal |
US9298341B2 (en) * | 2012-09-25 | 2016-03-29 | Samsung Electronics Co., Ltd. | Apparatus and method for switching split view in portable terminal |
US20150268827A1 (en) * | 2014-03-24 | 2015-09-24 | Hideep Inc. | Method for controlling moving direction of display object and a terminal thereof |
US9747256B2 (en) | 2014-07-16 | 2017-08-29 | International Business Machines Corporation | Energy and effort efficient reading sessions |
US9606966B2 (en) | 2014-07-16 | 2017-03-28 | International Business Machines Corporation | Energy and effort efficient reading sessions |
US10671275B2 (en) * | 2014-09-04 | 2020-06-02 | Apple Inc. | User interfaces for improving single-handed operation of devices |
US9600449B2 (en) | 2014-10-09 | 2017-03-21 | Wrap Media, LLC | Authoring tool for the authoring of wrap packages of cards |
US9449335B2 (en) | 2014-10-09 | 2016-09-20 | Wrap Media, LLC | Delivering wrapped packages in response to the selection of advertisements |
US9448972B2 (en) * | 2014-10-09 | 2016-09-20 | Wrap Media, LLC | Wrap package of cards supporting transactional advertising |
US20160104202A1 (en) * | 2014-10-09 | 2016-04-14 | Wrap Media, LLC | Wrap package of cards supporting transactional advertising |
US9465788B2 (en) | 2014-10-09 | 2016-10-11 | Wrap Media, LLC | Authoring tool for the authoring of wrap packages of cards |
US9489684B2 (en) | 2014-10-09 | 2016-11-08 | Wrap Media, LLC | Delivering wrapped packages in response to the selection of advertisements |
US20160342573A1 (en) * | 2014-10-09 | 2016-11-24 | Wrap Media, LLC | Wrap package of cards supporting transactional advertising |
US9330192B1 (en) * | 2014-10-09 | 2016-05-03 | Wrap Media, LLC | Method for rendering content using a card based JSON wrap package |
US9582154B2 (en) | 2014-10-09 | 2017-02-28 | Wrap Media, LLC | Integration of social media with card packages |
US9418056B2 (en) | 2014-10-09 | 2016-08-16 | Wrap Media, LLC | Authoring tool for the authoring of wrap packages of cards |
US9582813B2 (en) | 2014-10-09 | 2017-02-28 | Wrap Media, LLC | Delivering wrapped packages in response to the selection of advertisements |
US9600452B2 (en) * | 2014-10-09 | 2017-03-21 | Wrap Media, LLC | Wrap package of cards supporting transactional advertising |
US9600464B2 (en) | 2014-10-09 | 2017-03-21 | Wrap Media, LLC | Authoring tool for the authoring of wrap packages of cards |
US9600594B2 (en) | 2014-10-09 | 2017-03-21 | Wrap Media, LLC | Card based package for distributing electronic media and services |
US9448988B2 (en) | 2014-10-09 | 2016-09-20 | Wrap Media Llc | Authoring tool for the authoring of wrap packages of cards |
US11379112B2 (en) | 2014-10-29 | 2022-07-05 | Kyndryl, Inc. | Managing content displayed on a touch screen enabled device |
US10275142B2 (en) | 2014-10-29 | 2019-04-30 | International Business Machines Corporation | Managing content displayed on a touch screen enabled device |
US11836340B2 (en) | 2014-10-30 | 2023-12-05 | Google Llc | Systems and methods for presenting scrolling online content on mobile devices |
US12248674B2 (en) | 2014-10-30 | 2025-03-11 | Google Llc | Systems and methods for presenting scrolling online content on mobile devices |
US9582917B2 (en) * | 2015-03-26 | 2017-02-28 | Wrap Media, LLC | Authoring tool for the mixing of cards of wrap packages |
US9600803B2 (en) | 2015-03-26 | 2017-03-21 | Wrap Media, LLC | Mobile-first authoring tool for the authoring of wrap packages |
US20160284112A1 (en) * | 2015-03-26 | 2016-09-29 | Wrap Media, LLC | Authoring tool for the mixing of cards of wrap packages |
US20180129392A1 (en) * | 2015-05-11 | 2018-05-10 | Kakao Corp. | Content display control method and user terminal for performing content display control method |
US10795564B2 (en) * | 2015-05-11 | 2020-10-06 | Kakao Corp. | Content display control method and user terminal for performing content display control method |
US9767078B2 (en) * | 2015-08-06 | 2017-09-19 | Dropbox, Inc. | Embedding dynamic content item preview |
US10013397B2 (en) | 2015-08-06 | 2018-07-03 | Dropbox, Inc. | Embedding dynamic content item preview |
US20170039168A1 (en) * | 2015-08-06 | 2017-02-09 | Dropbox, Inc. | Embedding Dynamic Content Item Preview |
US11971860B2 (en) | 2015-12-28 | 2024-04-30 | Dropbox, Inc. | Embedded folder views |
US11126685B2 (en) * | 2016-10-28 | 2021-09-21 | Ebay Inc. | Preview and optimization of publication for target computing device |
US11741300B2 (en) | 2017-11-03 | 2023-08-29 | Dropbox, Inc. | Embedded spreadsheet data implementation and synchronization |
US10929593B2 (en) | 2018-01-31 | 2021-02-23 | Microsoft Technology Licensing, Llc | Data slicing of application file objects and chunk-based user interface navigation |
US11379107B2 (en) * | 2018-08-27 | 2022-07-05 | Sharp Kabushiki Kaisha | Display device, display method, and program |
US20200064978A1 (en) * | 2018-08-27 | 2020-02-27 | Sharp Kabushiki Kaisha | Display device, display method, and program |
CN111290811A (en) * | 2020-01-20 | 2020-06-16 | 北京无限光场科技有限公司 | Page content display method and device, electronic equipment and computer readable medium |
CN112965646A (en) * | 2021-03-05 | 2021-06-15 | 广州文石信息科技有限公司 | Method and device for calculating page number of subdirectory of streaming document |
Also Published As
Publication number | Publication date |
---|---|
US10867117B2 (en) | 2020-12-15 |
US20150363366A1 (en) | 2015-12-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10867117B2 (en) | Optimized document views for mobile device interfaces | |
US10782844B2 (en) | Smart whiteboard interactions | |
US9003298B2 (en) | Web page application controls | |
US8966361B2 (en) | Providing summary view of documents | |
JP6165154B2 (en) | Content adjustment to avoid occlusion by virtual input panel | |
US20130191785A1 (en) | Confident item selection using direct manipulation | |
US20080288894A1 (en) | User interface for documents table of contents | |
US20120272144A1 (en) | Compact control menu for touch-enabled command execution | |
KR102369604B1 (en) | Presenting fixed format documents in reflowed format | |
US20150169504A1 (en) | Layer based reorganization of document components | |
US20150033188A1 (en) | Scrollable smart menu | |
US9164972B2 (en) | Managing objects in panorama display to navigate spreadsheet | |
US20090313574A1 (en) | Mobile document viewer | |
US11379112B2 (en) | Managing content displayed on a touch screen enabled device | |
US9792268B2 (en) | Zoomable web-based wall with natural user interface | |
US20140164911A1 (en) | Preserving layout of region of content during modification | |
US9367223B2 (en) | Using a scroll bar in a multiple panel user interface | |
EP2825947A1 (en) | Web page application controls | |
US20150058710A1 (en) | Navigating fixed format document in e-reader application | |
US20140351745A1 (en) | Content navigation having a selection function and visual indicator thereof | |
US20240012555A1 (en) | Identifying and navigating to a visual item on a web page | |
US20120124514A1 (en) | Presentation focus and tagging | |
CN114969398A (en) | Interface display method and device, electronic equipment and readable storage medium | |
US20150052429A1 (en) | Interface method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MICROSOFT CORPORATION, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:YUAN, SHARLENE;CHANG, JACKIE;WANG, BUDDHA;AND OTHERS;SIGNING DATES FROM 20120503 TO 20120509;REEL/FRAME:028389/0436 |
|
AS | Assignment |
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034544/0541 Effective date: 20141014 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |