US20180181273A1 - Display control device, display control method, and recording medium - Google Patents
Display control device, display control method, and recording medium
- Publication number
- US20180181273A1 (application US15/904,654)
- Authority
- US
- United States
- Prior art keywords
- orientation
- unit
- display
- display screen
- display control
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/211—Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
Definitions
- FIG. 6 is a diagram illustrating an example of a case where a selection result screen 50-A3 is displayed as a result of the content C7 being selected in the first example.
- The display control unit 113 displays the selection result screen 50-A3 including the enlarged content C7 at the display unit 150.
- When the user performs predetermined operation, the display control unit 113 may return the display to the previous screen. While the predetermined operation is not particularly limited, the predetermined operation may include operation of the user holding down a button on the terminal 170 for a long period of time.
- FIG. 7 is a diagram illustrating an example of operation for returning display from the selection result screen 50-A3 to the previous screen 50-A2 in the first example.
- FIG. 8 is a diagram illustrating an example of a case where display is returned from the selection result screen 50-A3 to the previous screen 50-A2 in the first example.
- When the user performs this operation, the display control unit 113 displays the previous screen 50-A2 at the display unit 150.
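- The return-from-selection behavior described above (a long press on the terminal 170 bringing the display back from the selection result screen 50-A3 to the previous screen 50-A2) can be pictured as a simple screen stack. The sketch below is only an illustration under that assumption; the class and method names are not taken from the patent.

```python
class ScreenStack:
    """Keeps previously displayed screens so the display can be returned to them."""

    def __init__(self, initial_screen):
        self._stack = [initial_screen]

    @property
    def current(self):
        return self._stack[-1]

    def show(self, screen):
        # Display a new screen (for example, a selection result screen) on top.
        self._stack.append(screen)

    def on_long_press(self):
        # Return to the previous screen, if any, when the button is held down.
        if len(self._stack) > 1:
            self._stack.pop()
        return self.current


if __name__ == "__main__":
    nav = ScreenStack("screen 50-A2")
    nav.show("selection result screen 50-A3")
    print(nav.on_long_press())  # back to "screen 50-A2"
```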
- FIG. 9 is a flowchart illustrating the flow of operation of the display control device 10 in the first example. Note that FIG. 9 is merely one example of that flow; the flow of the operation of the display control device 10 in the first example is therefore not limited to the example illustrated in FIG. 9.
- First, the orientation u of the display unit is detected by the first detecting unit 151, input to the control unit 110 via the first input unit 131, and acquired by the first acquiring unit 111 (S11).
- Likewise, the orientation t of the terminal is detected by the second detecting unit 171, input to the control unit 110 via the second input unit 132, and acquired by the second acquiring unit 112 (S12).
- The display control unit 113 then scrolls the content according to the orientation t of the terminal relative to the orientation u of the display unit (S13). Note that after the operation of S13 is finished, the control unit 110 may return to the operation of S11 again or may finish the operation.
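- The three steps S11 to S13 map naturally onto a small per-frame update loop. The following Python sketch is an assumed illustration of such a loop; the detector and controller interfaces (StubDetector, StubDisplayControl, scroll_relative) are hypothetical stand-ins for the first detecting unit 151, the second detecting unit 171, and the display control unit 113.

```python
import time


class StubDetector:
    """Stand-in for the first/second detecting unit; returns a fixed yaw in degrees."""

    def __init__(self, yaw_deg):
        self.yaw_deg = yaw_deg

    def orientation(self):
        return self.yaw_deg


class StubDisplayControl:
    """Stand-in for the display control unit 113."""

    def scroll_relative(self, u, t):
        print(f"scroll according to t - u = {t - u:+.1f} deg")


def run_first_example(first_detector, second_detector, control, frames=3):
    for _ in range(frames):
        u = first_detector.orientation()   # S11: orientation u of the display unit
        t = second_detector.orientation()  # S12: orientation t of the terminal
        control.scroll_relative(u, t)      # S13: scroll according to t relative to u
        time.sleep(1 / 60)                 # then return to S11 (or finish)


if __name__ == "__main__":
    run_first_example(StubDetector(0.0), StubDetector(-30.0), StubDisplayControl())
```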
- The first example has been described above.
- FIG. 10 is a diagram illustrating an example of a case where focus is moved in the second example.
- The user wears the display unit 150 on his head and holds the terminal 170 with part of his body (for example, a hand).
- The display control unit 113 displays a screen 50-B1 at the display unit 150.
- The screen 50-B1 includes objects B1 to B4.
- The first acquiring unit 111 acquires the orientation u of the display unit detected by the first detecting unit 151. Further, the second acquiring unit 112 acquires the orientation t of the terminal detected by the second detecting unit 171.
- Here, the display control unit 113 moves focus according to the orientation t of the terminal. By this means, the user can easily move the focus.
- The orientation u of the display unit can also be used by the display control unit 113.
- In such a case, the display control unit 113 may perform predetermined control according to the orientation u of the display unit.
- For example, the display control unit 113 may move the focus according to a relationship between the orientation u of the display unit and the orientation t of the terminal.
- As one example, the display control unit 113 may move the focus according to the orientation t of the terminal relative to the orientation u of the display unit.
- FIG. 10 illustrates a case where the orientation t of the terminal relative to the orientation u of the display unit is leftward when viewed from the user. At this time, the display control unit 113 may move the focus to a position of an object B1 in an upper left region of the screen 50-B1.
- Alternatively, the display control unit 113 may move the focus according to the orientation t of the terminal relative to a reference orientation.
- The reference orientation may be set in advance by the user or may be set by the control unit 110.
- When the user performs predetermined operation, the display control unit 113 may select the object on which focus is placed.
- The predetermined operation is not particularly limited, and may include operation of the user tapping the terminal 170.
- FIG. 11 is a diagram illustrating an example of a case where the object B1 is selected in the second example.
- When the user taps the terminal 170 in a state where focus is placed on the object B1, the display control unit 113 selects the object B1 on which focus is placed. After the object B1 is selected, the display control unit 113 may perform any control on the object B1. For example, as a result of the object B1 being selected, a selection result screen including content corresponding to the object B1 may be displayed.
- FIG. 12 is a diagram illustrating an example of a case where a selection result screen 50-B2 is displayed as a result of the object B1 being selected in the second example.
- The display control unit 113 displays the selection result screen 50-B2 including content C11 corresponding to the object B1 at the display unit 150.
- When the user performs predetermined operation, the display control unit 113 may return the display to the previous screen.
- The predetermined operation is not particularly limited, and may include operation of the user holding down a button on the terminal 170 for a long period of time.
- FIG. 13 is a diagram illustrating an example of a case where display is returned from the selection result screen 50-B2 to the previous screen 50-B1 in the second example.
- When the user performs this operation, the display control unit 113 displays the previous screen 50-B1.
- FIG. 14 is a flowchart illustrating the flow of operation of the display control device 10 in the second example. Note that the example illustrated in FIG. 14 is merely one example of that flow; the flow of the operation of the display control device 10 in the second example is therefore not limited to the example illustrated in FIG. 14.
- First, the orientation u of the display unit is detected by the first detecting unit 151, input to the control unit 110 via the first input unit 131, and acquired by the first acquiring unit 111 (S21).
- Likewise, the orientation t of the terminal is detected by the second detecting unit 171, input to the control unit 110 via the second input unit 132, and acquired by the second acquiring unit 112 (S22).
- The display control unit 113 then selects an object according to the orientation t of the terminal relative to the orientation u of the display unit (S23). Note that after the operation of S23 is finished, the control unit 110 may return to the operation of S21 again or may finish the operation.
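- As an illustration of S23, the sketch below maps the terminal orientation t relative to the display orientation u onto one of the objects B1 to B4. The 2x2 grid layout and the sign conventions (negative yaw meaning leftward, positive pitch meaning upward) are assumptions made for the example, not details taken from the patent.

```python
def pick_object(relative_yaw_deg, relative_pitch_deg):
    """Map the terminal orientation t, taken relative to the display orientation u,
    onto one of the objects B1 to B4 arranged in an assumed 2x2 grid:
    left/right from the yaw component, upper/lower from the pitch component."""
    column = 0 if relative_yaw_deg < 0 else 1   # negative yaw: terminal points leftward
    row = 0 if relative_pitch_deg >= 0 else 1   # positive pitch: terminal points upward
    grid = [["B1", "B2"],
            ["B3", "B4"]]
    return grid[row][column]


if __name__ == "__main__":
    # Terminal pointing to the upper left relative to the display selects B1 (cf. FIG. 10).
    print(pick_object(relative_yaw_deg=-20.0, relative_pitch_deg=10.0))
```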
- The second example has been described above.
- FIG. 15 is a diagram illustrating an example of a case where content is scrolled in the third example.
- The user wears the display unit 150 on his head and holds the terminal 170 with part of his body (for example, a hand).
- The display control unit 113 displays the screen 50-C1 at the display unit 150.
- The screen 50-C1 includes content C1 to C7 and objects B1 to B4.
- The first acquiring unit 111 acquires the orientation u of the display unit detected by the first detecting unit 151. Further, the second acquiring unit 112 acquires the orientation t of the terminal detected by the second detecting unit 171.
- Here, the display control unit 113 scrolls content according to the orientation u of the display unit. Further, the display control unit 113 selects an object according to the orientation t of the terminal. By this means, it becomes possible to easily scroll content and easily select an object.
- The display control unit 113 may scroll the content according to the orientation u of the display unit relative to a reference orientation. More specifically, the display control unit 113 may scroll the content in the orientation opposite to the orientation u of the display unit relative to the reference orientation.
- The reference orientation may be set in advance by the user or may be set by the control unit 110.
- Likewise, the display control unit 113 may select an object according to the orientation t of the terminal relative to a reference orientation.
- This reference orientation may also be set in advance by the user or may be set by the control unit 110.
- FIG. 16 is a flowchart illustrating the flow of operation of the display control device 10 in the third example. Note that the example illustrated in FIG. 16 is merely one example of that flow; the flow of the operation of the display control device 10 in the third example is therefore not limited to the example illustrated in FIG. 16.
- First, the orientation u of the display unit is detected by the first detecting unit 151, input to the control unit 110 via the first input unit 131, and acquired by the first acquiring unit 111 (S31).
- Likewise, the orientation t of the terminal is detected by the second detecting unit 171, input to the control unit 110 via the second input unit 132, and acquired by the second acquiring unit 112 (S32).
- The display control unit 113 scrolls content according to the orientation u of the display unit (S33). Further, the display control unit 113 selects an object according to the orientation t of the terminal (S34). Note that after the operation of S34 is finished, the control unit 110 may return to the operation of S31 again or may finish the operation.
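- A minimal sketch of S33 and S34 under the reference-orientation variant described above: content is scrolled opposite to the display orientation u relative to the reference orientation, and an object is selected from the terminal orientation t relative to the same reference. The gain value and the two-object left/right layout are illustrative assumptions.

```python
def scroll_amount(display_yaw_deg, reference_yaw_deg=0.0, gain=10.0):
    """S33: scroll opposite to the display orientation u relative to the reference
    orientation (positive result means scroll right)."""
    return -gain * (display_yaw_deg - reference_yaw_deg)


def select_object(terminal_yaw_deg, reference_yaw_deg=0.0):
    """S34: select an object according to the terminal orientation t relative to the
    reference orientation; here simply the left or right object of an assumed pair."""
    return "B1" if terminal_yaw_deg - reference_yaw_deg < 0 else "B2"


if __name__ == "__main__":
    print(scroll_amount(15.0))    # display turned 15 deg right -> scroll left (-150.0)
    print(select_object(-20.0))   # terminal pointing left -> B1
```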
- The third example has been described above.
- The display control unit 113 may determine whether or not to scroll the content based on whether or not the user inputs predetermined operation. While a case will be described in the following description where the predetermined operation is touch operation, the predetermined operation may be any operation.
- FIG. 17 is a diagram illustrating a modification of the first example and the third example.
- The user wears the display unit 150 on his head and holds the terminal 170 with part of his body (for example, a hand).
- The display control unit 113 displays a screen 50-D1 at the display unit 150.
- The screen 50-D1 includes content C1 to C7.
- The first acquiring unit 111 acquires the orientation u of the display unit detected by the first detecting unit 151. Further, the second acquiring unit 112 acquires the orientation t of the terminal detected by the second detecting unit 171.
- The display control unit 113 may determine not to scroll the content while the user does not input touch operation. Meanwhile, as illustrated in a screen 50-D2, the display control unit 113 may determine to scroll the content while the user inputs touch operation.
- The operation performed while the user does not input touch operation may be replaced with the operation performed while the user inputs touch operation. That is, the display control unit 113 may determine not to scroll the content while the user inputs touch operation, and may determine to scroll the content while the user does not input touch operation.
- Further, the display control unit 113 selects an object according to the orientation t of the terminal.
- Similarly, the display control unit 113 may determine whether or not to select an object based on whether or not the user inputs predetermined operation.
- The display control unit 113 may determine not to select an object while the user does not input touch operation. Meanwhile, the display control unit 113 may determine to select an object while the user inputs touch operation.
- The operation performed while the user inputs touch operation may be replaced with the operation performed while the user does not input touch operation. That is, the display control unit 113 may determine not to select an object while the user inputs touch operation, and may determine to select an object while the user does not input touch operation.
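- The touch-gating modification can be expressed as a single check applied each frame. The sketch below assumes a boolean touch flag and a per-frame scroll velocity; the flag name and the scroll_while_touching switch (which selects between the two variants described above) are illustrative assumptions.

```python
def maybe_scroll(touch_active, scroll_velocity, scroll_while_touching=True):
    """Decide whether to apply the scroll on this frame.

    With scroll_while_touching=True the content scrolls only while the user inputs
    touch operation (as in screen 50-D2); with False the behavior is the inverted
    variant (scroll only while touch operation is not input)."""
    allowed = touch_active if scroll_while_touching else not touch_active
    return scroll_velocity if allowed else 0.0


if __name__ == "__main__":
    print(maybe_scroll(touch_active=False, scroll_velocity=300.0))  # 0.0: no touch, no scroll
    print(maybe_scroll(touch_active=True, scroll_velocity=300.0))   # 300.0: scroll applied
```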
- FIG. 18 is a diagram illustrating a hardware configuration example of the display control device 10 according to the embodiment of the present disclosure.
- The hardware configuration example illustrated in FIG. 18 is merely one example of a hardware configuration of the display control device 10. Therefore, the hardware configuration of the display control device 10 is not limited to the example illustrated in FIG. 18.
- The display control device 10 includes a central processing unit (CPU) 901, a read-only memory (ROM) 902, a random-access memory (RAM) 903, an input device 908, an output device 910, a storage device 911, and a drive 912.
- The CPU 901, which functions as a calculation processing device and a control device, controls the overall operation of the display control device 10 based on various programs. Further, the CPU 901 may be a microprocessor.
- The ROM 902 stores programs, calculation parameters and the like used by the CPU 901.
- The RAM 903 temporarily stores the programs to be used during execution by the CPU 901, and parameters that appropriately change during that execution. These units are connected to each other by a host bus, which is configured from a CPU bus or the like.
- The input device 908 receives input of the orientation of the display unit 150 detected by the first detecting unit 151 and the orientation of the terminal 170 detected by the second detecting unit 171.
- The orientation of the display unit 150 and the orientation of the terminal 170 received at the input device 908 are output to the CPU 901.
- The input device 908 may also receive input of a change amount of the orientation of the display unit 150 detected by the first detecting unit 151 and a change amount of the orientation of the terminal 170 detected by the second detecting unit 171, and output the change amounts to the CPU 901.
- The output device 910 provides output data to the display unit 150.
- If the display unit 150 is configured from a display device, the output device 910 provides display data to the display unit 150 under the control of the CPU 901.
- If the display unit 150 is configured from an audio output device, the output device 910 provides audio data to the display unit 150 under the control of the CPU 901.
- The storage device 911 is a device used to store data that is configured as an example of the storage unit 120 in the display control device 10.
- The storage device 911 may also include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
- This storage device 911 stores programs executed by the CPU 901 and various kinds of data.
- The drive 912 is a storage medium reader/writer, which may be built into or externally attached to the display control device 10.
- The drive 912 reads information recorded on a removable storage medium 71, such as a mounted magnetic disk, optical disc, magneto-optical disk, or semiconductor memory, and outputs the read information to the RAM 903. Further, the drive 912 can also write information to the removable storage medium 71.
- As described above, according to the embodiment of the present disclosure, there is provided a display control device 10 which includes a first acquiring unit 111 configured to acquire orientation of a display unit 150 detected by the first detecting unit 151, and a display control unit 113 configured to display content at the display unit 150, wherein the display control unit 113 scrolls the content according to the orientation of the display unit 150.
- It is also possible to create a program for causing hardware such as the CPU, the ROM, and the RAM built into a computer to realize the same functions as those of the units included in the above-described display control device 10.
- A computer-readable recording medium having this program recorded thereon can also be provided.
- The present technology may also be configured as below.
- A display control device including: a first acquiring unit configured to acquire orientation of a display unit detected by a first detecting unit; and a display control unit configured to display content at the display unit, wherein the display control unit scrolls the content according to the orientation of the display unit.
- The display control device further including: a second acquiring unit configured to acquire orientation of a terminal detected by a second detecting unit, wherein the display control unit performs predetermined control according to the orientation of the terminal.
- The display control unit scrolls the content according to a relationship between the orientation of the display unit and the orientation of the terminal.
- The display control unit scrolls the content according to the orientation of the terminal which is relative to the orientation of the display unit.
- The display control unit controls scroll speed of the content according to an angular difference between the orientation of the display unit and the orientation of the terminal.
- The display control unit scrolls the content according to the orientation of the display unit and selects an object based on the orientation of the terminal.
- The display control unit determines whether or not to scroll the content based on whether or not a user inputs predetermined operation.
- A display control method including: acquiring orientation of a display unit detected by a first detecting unit; displaying content at the display unit; and scrolling the content according to the orientation of the display unit.
- A computer-readable recording medium having a program recorded therein, the program causing a computer to function as a display control device including: a first acquiring unit configured to acquire orientation of a display unit detected by a first detecting unit; and a display control unit configured to display content at the display unit, wherein the display control unit scrolls the content according to the orientation of the display unit.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Optics & Photonics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
There is provided a display control device including a first acquiring unit configured to acquire orientation of a display unit detected by a first detecting unit, and a display control unit configured to display content at the display unit. The display control unit scrolls the content according to the orientation of the display unit.
Description
- The present application is a continuation application of U.S. patent application Ser. No. 14/889,569, filed Nov. 6, 2015, and claims the benefit of Japanese Priority Patent Application JP 2013-102885 filed May 15, 2013, the entire contents of which are incorporated herein by reference. Each of the above-referenced applications is hereby incorporated herein by reference in its entirety.
- The present disclosure relates to a display control device, a display control method, and a recording medium.
- In recent years, a head mounted display (HMD), that is, a display to be mounted on the head of a user, has been developed. While it is assumed that fixed content is displayed at the HMD regardless of the orientation of the head of the user, it is also assumed that the content may be changed based on the orientation of the head of the user. For example, a technique of specifying content to be displayed at the HMD based on the orientation of the head of the user has been disclosed (see, for example, Patent Literature 1).
- Patent Literature 1: JP 2013-12024A
- However, it is desirable to realize a technique for allowing a user to easily scroll content.
- According to the present disclosure, there is provided a display control device including: a first acquiring unit configured to acquire orientation of a display unit detected by a first detecting unit; and a display control unit configured to display content at the display unit. The display control unit scrolls the content according to the orientation of the display unit.
- Further, according to the present disclosure, there is provided a display control method including: acquiring orientation of a display unit detected by a first detecting unit; displaying content at the display unit; and scrolling the content according to the orientation of the display unit.
- There is provided a computer-readable recording medium having a program recorded therein, the program causing a computer to function as a display control device including: a first acquiring unit configured to acquire orientation of a display unit detected by a first detecting unit; and a display control unit configured to display content at the display unit. The display control unit scrolls the content according to the orientation of the display unit.
- As described above, according to the present disclosure, it is possible to provide a technique for allowing a user to easily scroll content.
- FIG. 1 is a diagram illustrating outline of an information processing system according to an embodiment of the present disclosure.
- FIG. 2 is a diagram illustrating a configuration example of functions of the information processing system according to the embodiment of the present disclosure.
- FIG. 3 is a diagram illustrating an example where content is scrolled in a first example.
- FIG. 4 is a diagram illustrating an example where scrolling is stopped in the first example.
- FIG. 5 is a diagram illustrating an example of operation for selecting content in the first example.
- FIG. 6 is a diagram illustrating an example where a selection result screen is displayed as a result of content being selected in the first example.
- FIG. 7 is a diagram illustrating an example of operation for returning display from the selection result screen to the previous screen in the first example.
- FIG. 8 is a diagram illustrating an example where display is returned from the selection result screen to the previous screen in the first example.
- FIG. 9 is a flowchart illustrating flow of operation of a display control device in the first example.
- FIG. 10 is a diagram illustrating an example where focus is moved in a second example.
- FIG. 11 is a diagram illustrating an example where an object is selected in the second example.
- FIG. 12 is a diagram illustrating an example where a selection result screen is displayed as a result of an object being selected in the second example.
- FIG. 13 is a diagram illustrating an example where display is returned from the selection result screen to display of the object in the second example.
- FIG. 14 is a flowchart illustrating flow of operation of a display control device in the second example.
- FIG. 15 is a diagram illustrating an example where content is scrolled in a third example.
- FIG. 16 is a flowchart illustrating flow of operation of a display control device in the third example.
- FIG. 17 is a diagram illustrating a modification of the first example and the third example.
- FIG. 18 is a diagram illustrating a hardware configuration example of a display control device according to the embodiment of the present disclosure.
- Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
- Further, in this specification and the appended drawings, structural elements that have substantially the same function and structure are in some cases differentiated by denoting with different alphabet letters provided after the same reference numeral. However, in cases where it is not necessary to distinguish among a plurality of structural elements having substantially the same function and structure, such structural elements are denoted using just the same reference numeral.
- Further, the description of the embodiment will be given in the order of the following items:
- 1-1. Outline of information processing system
- 1-2. Configuration example of functions of information processing system
- 1-3. Details of functions of information processing system
- 1-4. Hardware configuration example
- An embodiment of the present disclosure will be described first.
- First, a configuration example of an information processing system 1 according to the embodiment of the present disclosure will be described. FIG. 1 is a diagram illustrating outline of the information processing system 1 according to the embodiment of the present disclosure. As illustrated in FIG. 1, the information processing system 1 includes a display control device 10, a display unit 150, and a terminal 170.
- The display unit 150 has a function of displaying a screen 50 according to a control signal provided in a wireless or wired manner from the display control device 10. Types of the screen 50 displayed at the display unit 150 are not particularly limited. As illustrated in FIG. 1, a case is mainly assumed where the display unit 150 is a head mounted display (HMD) to be mounted on the head of a user U. Types of the HMD are not particularly limited, and the HMD may be a transmission type HMD or a non-transmission type HMD.
- The terminal 170 is a terminal used by the user U. Types of the terminal 170 are not particularly limited, and the terminal 170 may be a video camera, a smartphone, or a personal digital assistant (PDA). Alternatively, the terminal 170 may be a personal computer (PC), a mobile phone, a mobile music reproduction device, a mobile image processing device, or mobile game equipment.
- The display control device 10 has a function of displaying the screen 50 at the display unit 150. The screen 50 may include content C1 to C7 as illustrated in FIG. 1 or may include objects as will be described later. The number of pieces of content is not particularly limited. Further, types of content are not particularly limited, and, as illustrated in FIG. 1, the content may be an image (such as, for example, a still image or a moving image) or may be text data. Note that while the display control device 10 is configured separately from the display unit 150 in the example illustrated in FIG. 1, the display control device 10 may be integrated with the display unit 150.
- In the present specification, a technique for allowing the user U to easily scroll the content to be displayed at the display unit 150 in this manner will be mainly described.
- The outline of the information processing system 1 according to the embodiment of the present disclosure has been described above.
- A configuration example of functions of the information processing system 1 according to the embodiment of the present disclosure will be described next. FIG. 2 is a diagram illustrating the configuration example of the functions of the information processing system 1 according to the embodiment of the present disclosure. As illustrated in FIG. 2, the display control device 10 according to the embodiment of the present disclosure includes a control unit 110, a storage unit 120, a first input unit 131, and a second input unit 132.
- The control unit 110 corresponds to, for example, a processor such as a central processing unit (CPU). The control unit 110 fulfills its various functions by executing a program stored in the storage unit 120 or other storage media. The control unit 110 includes a first acquiring unit 111, a second acquiring unit 112, and a display control unit 113. Functions of these respective function blocks will be described later.
- The storage unit 120 stores a program for operating the control unit 110 using a semiconductor memory or a storage medium such as a hard disc. Further, for example, the storage unit 120 can also store various kinds of data (such as, for example, content and objects) to be used by the program. Note that while the storage unit 120 is integrated with the display control device 10 in the example illustrated in FIG. 2, the storage unit 120 may be configured separately from the display control device 10.
- As described above, the display unit 150 is connected to the display control device 10 in a wireless or wired manner. The display unit 150 includes a first detecting unit 151, and the terminal 170 includes a second detecting unit 171. The first detecting unit 151 has a function of detecting orientation of the display unit 150. The orientation of the display unit 150 detected by the first detecting unit 151 is input to the control unit 110 via the first input unit 131. Meanwhile, the second detecting unit 171 has a function of detecting orientation of the terminal 170. The orientation of the terminal 170 detected by the second detecting unit 171 is input to the control unit 110 via the second input unit 132.
- The first detecting unit 151 detects the orientation of the display unit 150. For example, the first detecting unit 151 may include a geomagnetic sensor and measure geomagnetic data indicating orientation of geomagnetism in a coordinate system of the display unit 150 using the geomagnetic sensor. The geomagnetic data can be utilized for, for example, a direction the display unit 150 faces (orientation in a horizontal direction). Further, the first detecting unit 151 may include an acceleration sensor and measure acceleration applied to the display unit 150 using the acceleration sensor. The acceleration can be utilized for, for example, tilt of the display unit 150 (orientation in a vertical direction).
- The first detecting unit 151 may further include a gyro sensor, measure an angular velocity of rotation of the display unit 150 using the gyro sensor, and detect the angular velocity as a change rate of the orientation of the display unit 150. The change rate of the orientation of the display unit 150 detected in this manner can be utilized for detection of the orientation of the display unit 150 at the first detecting unit 151. Note that while the first detecting unit 151 is integrated with the display unit 150 in the example illustrated in FIG. 2, the first detecting unit 151 may be configured separately from the display unit 150.
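- As a rough illustration of how such sensor readings could be turned into an orientation, the following Python sketch derives a horizontal direction (yaw) from geomagnetic data, a vertical tilt (pitch) from acceleration, and integrates a gyro rate to track changes. The formulas assume the unit is held roughly level and are illustrative only; they are not the patent's implementation.

```python
import math


def horizontal_orientation(mag_x, mag_y):
    """Approximate the direction the unit faces (yaw, degrees) from geomagnetic data
    measured in the unit's own coordinate system; no tilt compensation."""
    return math.degrees(math.atan2(mag_y, mag_x)) % 360.0


def vertical_tilt(acc_x, acc_y, acc_z):
    """Approximate the tilt (pitch, degrees) from the gravity component of the
    measured acceleration."""
    return math.degrees(math.atan2(-acc_x, math.hypot(acc_y, acc_z)))


def integrate_gyro(prev_yaw_deg, yaw_rate_deg_per_s, dt_s):
    """A gyro sensor gives the change rate of the orientation; integrating it tracks
    the orientation between magnetometer updates."""
    return (prev_yaw_deg + yaw_rate_deg_per_s * dt_s) % 360.0


if __name__ == "__main__":
    print(horizontal_orientation(0.2, 0.2))  # about 45 degrees
    print(vertical_tilt(0.0, 0.0, 9.8))      # about 0 degrees (held level)
    print(integrate_gyro(44.0, 10.0, 0.1))   # about 45 degrees
```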
- The second detecting unit 171 detects the orientation of the terminal 170. For example, the second detecting unit 171 may include a geomagnetic sensor and measure geomagnetic data indicating orientation of geomagnetism in a coordinate system of the terminal 170 using the geomagnetic sensor. The geomagnetic data can be utilized for, for example, a direction the terminal 170 faces (orientation in a horizontal direction). Further, the second detecting unit 171 may include an acceleration sensor, measure acceleration applied to the terminal 170 using the acceleration sensor, and detect change of the orientation of the terminal 170 based on the acceleration. The acceleration can be utilized for, for example, tilt of the terminal 170 (orientation in a vertical direction).
- The second detecting unit 171 may further include a gyro sensor, measure an angular velocity of rotation of the terminal 170 using the gyro sensor, and detect the angular velocity as a change rate of the orientation of the terminal 170. The change rate of the orientation of the terminal 170 detected in this manner can be utilized for detection of the orientation of the terminal 170 at the second detecting unit 171. Note that while the second detecting unit 171 is integrated with the terminal 170 in the example illustrated in FIG. 2, the second detecting unit 171 may be configured separately from the terminal 170.
- The configuration example of the functions of the information processing system 1 according to the embodiment of the present disclosure has been described above.
- Details of functions of the information processing system 1 according to the embodiment of the present disclosure will be described next. First, a first example will be described with reference to FIG. 3 to FIG. 9. FIG. 3 is a diagram illustrating an example of a case where content is scrolled in the first example. As illustrated in FIG. 3, the user wears the display unit 150 on his head and holds the terminal 170 with part of his body (for example, a hand). The display control unit 113 displays the screen 50-A1 at the display unit 150. The screen 50-A1 includes content C1 to C7.
- The first acquiring unit 111 acquires the orientation u of the display unit detected by the first detecting unit 151. Further, the second acquiring unit 112 acquires the orientation t of the terminal detected by the second detecting unit 171. Here, the display control unit 113 scrolls content according to the orientation u of the display unit. By this means, it is possible to allow the user to easily scroll the content. Further, the user can scroll the content intuitively.
display control unit 113. In such a case, for example, thedisplay control unit 113 may perform predetermined control according to the orientation t of the terminal. By this means, because it becomes possible to also use the orientation t of the terminal, for example, even if it is difficult to adjust the orientation u of the display unit, it is possible to perform input by utilizing the orientation t of the terminal. Possible examples of the case where it is difficult to adjust the orientation u of the display unit include a case where it is difficult for the user to maintain the orientation of his head to fixed orientation for a long period of time. For example, thedisplay control unit 113 may scroll the content according to a relationship between the orientation u of the display unit and the orientation t of the terminal. - While the relationship between the orientation u of the display unit and the orientation t of the terminal may be any relationship, as one example, the
display control unit 113 may scroll the content according to the orientation t of the terminal which is relative to the orientation u of the display unit. More specifically, the display control unit 113 may scroll the content in the orientation opposite to the orientation t of the terminal which is relative to the orientation u of the display unit. FIG. 3 illustrates a case where the orientation t of the terminal which is relative to the orientation u of the display unit is leftward when viewed from the user. At this time, the display control unit 113 may scroll the content to the right, which is opposite to the left, so that the user can easily view the left side of the screen 50-A1.
- The scroll speed of the content is not particularly limited; for example, the display control unit 113 may control the scroll speed of the content according to an angular difference between the orientation u of the display unit and the orientation t of the terminal. More specifically, the display control unit 113 may increase the scroll speed of the content for a larger angular difference between the orientation u of the display unit and the orientation t of the terminal. This control allows the user to adjust the scroll speed of the content intuitively.
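- As one hedged sketch of this relative-orientation behavior (not the claimed implementation; the gain constant, the dead-zone threshold, and the sign convention are assumptions introduced here for illustration), the scroll velocity could be derived from the two orientations as follows:

```python
def scroll_velocity(display_heading_deg, terminal_heading_deg,
                    gain=2.0, dead_zone_deg=5.0):
    """Scroll opposite to the terminal orientation t taken relative to the
    display-unit orientation u, with speed growing with the angular difference.

    Returns a signed horizontal speed (pixels per frame, illustrative units);
    0.0 means scrolling is stopped because the orientations (nearly) match.
    """
    # Signed angular difference t - u, wrapped into (-180, 180].
    diff = (terminal_heading_deg - display_heading_deg + 180.0) % 360.0 - 180.0
    # Inside the dead zone the orientations are treated as matching,
    # so the content is not scrolled.
    if abs(diff) <= dead_zone_deg:
        return 0.0
    # The minus sign scrolls the content in the orientation opposite to the
    # relative terminal orientation; a larger |diff| scrolls faster.
    return -gain * diff
```

- The dead zone in this sketch anticipates the stop condition described below for FIG. 4, where scrolling stops once the two orientations match or their difference falls within a predetermined range.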
- Note that while a case has been described above where the orientation t of the terminal acquired by the second acquiring unit 112 is used by the display control unit 113, it is also possible to use some kind of reference orientation in place of the orientation t of the terminal. For example, the display control unit 113 may scroll the content according to the orientation u of the display unit which is relative to the reference orientation. More specifically, the display control unit 113 may scroll the content in the orientation opposite to the orientation u of the display unit which is relative to the reference orientation. The reference orientation may be set in advance by the user or may be set by the control unit 110.
- Here, the display control unit 113 only has to stop scrolling of the content when the orientation u of the display unit matches the orientation t of the terminal or when the difference between the orientations falls within a predetermined range. FIG. 4 is a diagram illustrating an example of a case where scrolling is stopped in the first example. Referring to FIG. 4, because the orientation u of the display unit matches the orientation t of the terminal after the content is scrolled, a screen 50-A2 including content C4 to C10 is displayed by the display control unit 113, and scrolling of the content is stopped in a state where the screen 50-A2 is displayed.
- Further, while focus is placed on content C7 in the center among content C4 to C10 in the example illustrated in FIG. 4, the content on which focus is placed is not particularly limited. In this state, for example, when the user performs predetermined operation on the terminal 170, the display control unit 113 may select the content on which focus is placed. While the predetermined operation is not particularly limited, the predetermined operation may include operation of the user tapping the terminal 170.
- FIG. 5 is a diagram illustrating an example of operation for selecting the content C7 in the first example. Referring to FIG. 5, when the user taps the terminal 170 in a state where focus is placed on the content C7, the display control unit 113 selects the content C7 on which focus is placed. The display control unit 113 may perform any control on the content C7 after the content C7 is selected. For example, as a result of the content C7 being selected, a selection result screen including enlarged content C7 may be displayed.
- FIG. 6 is a diagram illustrating an example of a case where a selection result screen 50-A3 is displayed as a result of the content C7 being selected in the first example. Referring to FIG. 6, the display control unit 113 displays the selection result screen 50-A3 including the enlarged content C7 at the display unit 150. Further, for example, when the user performs predetermined operation on the terminal 170 in a state where the selection result screen 50-A3 is displayed, the display control unit 113 may be able to return the screen to the previous screen. While the predetermined operation is not particularly limited, the predetermined operation may include operation of the user holding down a button on the terminal 170 for a long period of time.
- FIG. 7 is a diagram illustrating an example of operation for returning display from the selection result screen 50-A3 to the previous screen 50-A2 in the first example. Further, FIG. 8 is a diagram illustrating an example of a case where display is returned from the selection result screen 50-A3 to the previous screen 50-A2 in the first example. Referring to FIG. 7 and FIG. 8, when the user holds down a button on the terminal 170 for a long period of time in a state where the enlarged content C7 is displayed, the display control unit 113 displays the previous screen 50-A2 at the display unit 150.
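- The tap-to-select and hold-to-return behavior just described can be pictured as a small screen-state machine. The following sketch is illustrative only; the event names and state labels are assumptions, not part of the disclosure:

```python
def handle_terminal_operation(operation, state):
    """Advance the screen state for the selection flow described above.

    state     -- ("list", focused) while screen 50-A2 is shown, or
                 ("result", selected) while selection result screen 50-A3 is shown
    operation -- "tap" or "long_press" received from the terminal 170
    """
    mode, content = state
    if mode == "list" and operation == "tap":
        return ("result", content)   # select the focused content, show 50-A3
    if mode == "result" and operation == "long_press":
        return ("list", content)     # return to the previous screen 50-A2
    return state                     # other combinations leave the screen as-is

state = ("list", "C7")
state = handle_terminal_operation("tap", state)         # -> ("result", "C7")
state = handle_terminal_operation("long_press", state)  # -> ("list", "C7")
```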
- FIG. 9 is a flowchart illustrating the flow of operation of the display control device 10 in the first example.
- Note that the example illustrated in FIG. 9 is merely one example of the flow of the operation of the display control device 10 in the first example. Therefore, the flow of the operation of the display control device 10 in the first example is not limited to the example illustrated in FIG. 9.
- As illustrated in FIG. 9, when the orientation u of the display unit is detected by the first detecting unit 151, the orientation u of the display unit is input to the control unit 110 via the first input unit 131, and the orientation u of the display unit is acquired by the first acquiring unit 111 (S11). Subsequently, when the orientation t of the terminal is detected by the second detecting unit 171, the orientation t of the terminal is input to the control unit 110 via the second input unit 132, and the orientation t of the terminal is acquired by the second acquiring unit 112 (S12).
- The display control unit 113 scrolls the content according to the orientation t of the terminal which is relative to the orientation u of the display unit (S13). Note that after the operation of S13 is finished, the control unit 110 may return to the operation of S11 again or may finish the operation. The first example has been described above.
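- Putting S11 to S13 together, one pass of this flow might look like the following sketch. It reuses the hypothetical scroll_velocity helper from the earlier sketch, and the detector and display objects are assumed interfaces, not components defined by the disclosure:

```python
def first_example_step(first_detecting_unit, second_detecting_unit,
                       display_unit, content):
    """One pass of S11-S13: acquire both orientations, then scroll the
    content according to the terminal orientation t relative to the
    display-unit orientation u."""
    u = first_detecting_unit.orientation_deg()   # S11: orientation u of the display unit
    t = second_detecting_unit.orientation_deg()  # S12: orientation t of the terminal
    dx = scroll_velocity(u, t)                   # hypothetical helper from the earlier sketch
    if dx != 0.0:                                # 0.0 means the orientations (nearly) match
        display_unit.scroll(content, dx=dx)      # S13: scroll opposite to the relative t
    # The control unit 110 may then return to S11 or finish the operation.
```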
- A second example will be described next with reference to FIG. 10 to FIG. 14. FIG. 10 is a diagram illustrating an example of a case where focus is moved in the second example. As illustrated in FIG. 10, the user wears the display unit 150 on his head and holds the terminal 170 with part of his body (for example, a hand). The display control unit 113 displays a screen 50-B1 at the display unit 150. The screen 50-B1 includes objects B1 to B4.
- The first acquiring unit 111 acquires the orientation u of the display unit detected by the first detecting unit 151. Further, the second acquiring unit 112 acquires the orientation t of the terminal detected by the second detecting unit 171. Here, the display control unit 113 moves focus according to the orientation t of the terminal. By this means, the user can easily move the focus.
- The orientation u of the display unit can also be used by the display control unit 113. In such a case, for example, the display control unit 113 may perform predetermined control according to the orientation u of the display unit. Because it becomes possible to also use the orientation u of the display unit, variations of the operation which can be input by the user can be increased. For example, the display control unit 113 may move the focus according to a relationship between the orientation u of the display unit and the orientation t of the terminal.
- While the relationship between the orientation u of the display unit and the orientation t of the terminal may be any relationship, as one example, the display control unit 113 may move the focus according to the orientation t of the terminal which is relative to the orientation u of the display unit. More specifically, the display control unit 113 may move the focus in the orientation t of the terminal which is relative to the orientation u of the display unit. FIG. 10 illustrates a case where the orientation t of the terminal which is relative to the orientation u of the display unit is leftward when viewed from the user. At this time, the display control unit 113 may move the focus to the position of the object B1 in the upper left region of the screen 50-B1.
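- As a hedged illustration of how such a relative orientation could be mapped onto the four objects B1 to B4 (the grid layout, the pitch input, and the function name are assumptions introduced here, not part of the disclosure):

```python
def pick_focused_object(rel_yaw_deg, rel_pitch_deg):
    """Map the terminal orientation t, taken relative to the display-unit
    orientation u, to one of the objects B1-B4 laid out in the four corners
    of screen 50-B1 (illustrative layout only)."""
    column = "left" if rel_yaw_deg < 0.0 else "right"    # leftward t -> left column
    row = "upper" if rel_pitch_deg >= 0.0 else "lower"   # upward t -> upper row
    layout = {
        ("upper", "left"): "B1", ("upper", "right"): "B2",
        ("lower", "left"): "B3", ("lower", "right"): "B4",
    }
    return layout[(row, column)]
```

- With FIG. 10 in mind, a leftward relative orientation (a negative rel_yaw_deg in this sketch) would land on the object B1 in the upper left region of the screen 50-B1.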
- Note that while a case has been described above where the orientation u of the display unit acquired by the first acquiring unit 111 is used by the display control unit 113, it is also possible to use some kind of reference orientation in place of the orientation u of the display unit. For example, the display control unit 113 may move the focus according to the orientation t of the terminal which is relative to the reference orientation. More specifically, the display control unit 113 may move the focus in the orientation t of the terminal which is relative to the reference orientation. The reference orientation may be set in advance by the user or may be set by the control unit 110.
- When, for example, the user performs predetermined operation on the terminal 170 in a state where focus is placed on the position of the object B1, the display control unit 113 may be able to select the object on which focus is placed. The predetermined operation is not particularly limited, and may include operation of the user tapping the terminal 170.
- FIG. 11 is a diagram illustrating an example of a case where the object B1 is selected in the second example. Referring to FIG. 11, when the user taps the terminal 170 in a state where focus is placed on the object B1, the display control unit 113 selects the object B1 on which focus is placed. After the object B1 is selected, the display control unit 113 may perform any control on the object B1. For example, as a result of the object B1 being selected, a selection result screen including content corresponding to the object B1 may be displayed.
- FIG. 12 is a diagram illustrating an example of a case where a selection result screen 50-B2 is displayed as a result of the object B1 being selected in the second example. Referring to FIG. 12, the display control unit 113 displays the selection result screen 50-B2 including content C11 corresponding to the object B1 at the display unit 150. Further, when, for example, the user performs predetermined operation on the terminal 170 in a state where the selection result screen 50-B2 is displayed, the display control unit 113 may be able to return the screen to the previous screen. The predetermined operation is not particularly limited, and may include operation of the user holding down a button on the terminal 170 for a long period of time.
- FIG. 13 is a diagram illustrating an example of a case where display is returned from the selection result screen 50-B2 to the previous screen 50-B1 in the second example. Referring to FIG. 12 and FIG. 13, when the user holds down a button on the terminal 170 for a long period of time in a state where the content C11 corresponding to the object B1 is displayed, the display control unit 113 displays the previous screen 50-B1.
- FIG. 14 is a flowchart illustrating the flow of operation of the display control device 10 in the second example. Note that the example illustrated in FIG. 14 is merely an example of the flow of the operation of the display control device 10 in the second example. Therefore, the flow of the operation of the display control device 10 in the second example is not limited to the example illustrated in FIG. 14.
- As illustrated in FIG. 14, when the orientation u of the display unit is detected by the first detecting unit 151, the orientation u of the display unit is input to the control unit 110 via the first input unit 131, and the orientation u of the display unit is acquired by the first acquiring unit 111 (S21). Subsequently, when the orientation t of the terminal is detected by the second detecting unit 171, the orientation t of the terminal is input to the control unit 110 via the second input unit 132, and the orientation t of the terminal is acquired by the second acquiring unit 112 (S22).
- The display control unit 113 selects an object according to the orientation t of the terminal which is relative to the orientation u of the display unit (S23). Note that after the operation of S23 is finished, the control unit 110 may return to the operation of S21 again or may finish the operation. The second example has been described above.
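- One pass of S21 to S23 could be sketched as follows, reusing the hypothetical pick_focused_object helper from the earlier sketch; the detector and display interfaces are again assumptions for illustration:

```python
def second_example_step(first_detecting_unit, second_detecting_unit,
                        display_unit):
    """One pass of S21-S23: acquire both orientations, then move the focus to
    and select the object indicated by the relative terminal orientation."""
    u = first_detecting_unit.orientation_deg()   # S21: orientation u of the display unit
    t = second_detecting_unit.orientation_deg()  # S22: orientation t of the terminal
    rel_yaw = (t - u + 180.0) % 360.0 - 180.0    # terminal orientation relative to u
    focused = pick_focused_object(rel_yaw, 0.0)  # S23: object lying in that direction
    display_unit.focus(focused)
    return focused
```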
- Subsequently, referring to FIG. 15 and FIG. 16, a third example will be described. FIG. 15 is a diagram illustrating an example of a case where focus is moved in the third example. As illustrated in FIG. 15, the user wears the display unit 150 on his head and holds the terminal 170 with part of his body (for example, a hand). The display control unit 113 displays the screen 50-C1 at the display unit 150. The screen 50-C1 includes content C1 to C7 and objects B1 to B4.
- The first acquiring unit 111 acquires the orientation u of the display unit detected by the first detecting unit 151. Further, the second acquiring unit 112 acquires the orientation t of the terminal detected by the second detecting unit 171. Here, the display control unit 113 scrolls content according to the orientation u of the display unit. Further, the display control unit 113 selects an object according to the orientation t of the terminal. By this means, it becomes possible to easily scroll content and easily select an object.
- For example, the display control unit 113 may scroll the content according to the orientation u of the display unit which is relative to the reference orientation. More specifically, the display control unit 113 may scroll the content in the orientation opposite to the orientation u of the display unit which is relative to the reference orientation. The reference orientation may be set in advance by the user or may be set by the control unit 110.
- In a similar manner, for example, the display control unit 113 may select an object according to the orientation t of the terminal which is relative to the reference orientation. More specifically, the display control unit 113 may select the object positioned in the orientation t of the terminal which is relative to the reference orientation. The reference orientation may be set in advance by the user or may be set by the control unit 110.
- FIG. 16 is a flowchart illustrating the flow of operation of the display control device 10 in the third example. Note that the example illustrated in FIG. 16 is merely an example of the flow of the operation of the display control device 10 in the third example. Therefore, the flow of the operation of the display control device 10 in the third example is not limited to the example illustrated in FIG. 16.
- As illustrated in FIG. 16, when the orientation u of the display unit is detected by the first detecting unit 151, the orientation u of the display unit is input to the control unit 110 via the first input unit 131, and the orientation u of the display unit is acquired by the first acquiring unit 111 (S31). Subsequently, when the orientation t of the terminal is detected by the second detecting unit 171, the orientation t of the terminal is input to the control unit 110 via the second input unit 132, and the orientation t of the terminal is acquired by the second acquiring unit 112 (S32).
- The display control unit 113 scrolls content according to the orientation u of the display unit (S33). Further, the display control unit 113 selects an object according to the orientation t of the terminal (S34). Note that after the operation of S34 is finished, the control unit 110 may return to the operation of S31 again or may finish the operation. The third example has been described above.
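- Combining S31 to S34, one pass of the third example could be sketched as follows; it reuses the hypothetical helpers from the earlier sketches, and the reference orientation and the display/detector interfaces are assumptions for illustration:

```python
def third_example_step(first_detecting_unit, second_detecting_unit,
                       display_unit, content, reference_deg=0.0):
    """One pass of S31-S34: scroll from the display-unit orientation u and
    select an object from the terminal orientation t, both taken relative
    to a reference orientation."""
    u = first_detecting_unit.orientation_deg()   # S31
    t = second_detecting_unit.orientation_deg()  # S32
    # S33: scroll in the orientation opposite to u relative to the reference;
    # scroll_velocity(reference, u) returns a speed proportional to -(u - reference).
    display_unit.scroll(content, dx=scroll_velocity(reference_deg, u))
    # S34: select the object lying in the orientation t relative to the reference.
    rel_t = (t - reference_deg + 180.0) % 360.0 - 180.0
    display_unit.select(pick_focused_object(rel_t, 0.0))
```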
- In the first example and the third example, a case has been described where the display control unit 113 scrolls the content according to the orientation u of the display unit. However, it is assumed that the user may try to change the orientation of the display unit 150 for a purpose other than scrolling the content. Therefore, the display control unit 113 may determine whether or not to scroll the content based on whether or not the user inputs predetermined operation. While a case will be described in the following description where the predetermined operation is touch operation, the predetermined operation may be any operation.
- FIG. 17 is a diagram illustrating a modification of the first example and the third example. As illustrated in FIG. 17, the user wears the display unit 150 on his head and holds the terminal 170 with part of his body (for example, a hand). The display control unit 113 displays a screen 50-D1 at the display unit 150. The screen 50-D1 includes content C1 to C7.
- The first acquiring unit 111 acquires the orientation u of the display unit detected by the first detecting unit 151. Further, the second acquiring unit 112 acquires the orientation t of the terminal detected by the second detecting unit 171. Here, as illustrated in the screen 50-D1, the display control unit 113 may determine not to scroll the content while the user does not input touch operation. Meanwhile, as illustrated in a screen 50-D2, the display control unit 113 may determine to scroll the content while the user inputs touch operation.
- Note that the operation performed while the user does not input touch operation may be replaced with the operation performed while the user inputs touch operation. That is, the display control unit 113 may determine not to scroll the content while the user inputs touch operation, and may determine to scroll the content while the user does not input touch operation.
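- A minimal sketch of this touch gating (assuming a boolean touch state exposed by the terminal; the helper name comes from the earlier sketch and is hypothetical):

```python
def maybe_scroll(touch_active, display_heading_deg, terminal_heading_deg,
                 display_unit, content, scroll_while_touching=True):
    """Scroll only while the touch state matches the chosen policy.

    scroll_while_touching=True  -> scroll only while touch operation is input
    scroll_while_touching=False -> the inverted policy described above
    """
    if touch_active != scroll_while_touching:
        return  # leave the content as it is
    display_unit.scroll(content,
                        dx=scroll_velocity(display_heading_deg,
                                           terminal_heading_deg))
```

- The same gate could, under the same assumptions, be applied to object selection in the second and third examples, which the following paragraphs describe.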
- Further, in the second example and the third example, an example has been described where the display control unit 113 selects an object according to the orientation t of the terminal. However, it is assumed that the user may try to change the orientation of the terminal 170 for a purpose other than selecting an object. Therefore, the display control unit 113 may determine whether or not to select an object based on whether or not the user inputs predetermined operation.
- For example, the display control unit 113 may determine not to select an object while the user does not input touch operation. Meanwhile, the display control unit 113 may determine to select an object while the user inputs touch operation.
- Note that the operation performed while the user inputs touch operation may be replaced with the operation performed while the user does not input touch operation. That is, the display control unit 113 may determine not to select an object while the user inputs touch operation, and may determine to select an object while the user does not input touch operation.
- Details of the functions of the
information processing system 1 according to the embodiment of the present disclosure have been described above.
- Subsequently, a hardware configuration example of the display control device 10 according to the embodiment of the present disclosure will be described. FIG. 18 is a diagram illustrating a hardware configuration example of the display control device 10 according to the embodiment of the present disclosure. However, the hardware configuration example illustrated in FIG. 18 is merely an example of a hardware configuration of the display control device 10. Therefore, the hardware configuration of the display control device 10 is not limited to the example illustrated in FIG. 18.
- As illustrated in
FIG. 18, the display control device 10 includes a central processing unit (CPU) 901, a read-only memory (ROM) 902, a random-access memory (RAM) 903, an input device 908, an output device 910, a storage device 911, and a drive 912.
- The CPU 901, which functions as a calculation processing device and a control device, controls the overall operation of the display control device 10 based on various programs. Further, the CPU 901 may be a microprocessor. The ROM 902 stores programs, calculation parameters and the like used by the CPU 901. The RAM 903 temporarily stores the programs to be used during execution by the CPU 901, and parameters that appropriately change during that execution. These units are connected to each other by a host bus, which is configured from a CPU bus or the like.
- The
input device 908 receives input of the orientation of the display unit 150 detected by the first detecting unit 151 and the orientation of the terminal 170 detected by the second detecting unit 171. The orientation of the display unit 150 and the orientation of the terminal 170 received at the input device 908 are output to the CPU 901. Further, the input device 908 may receive input of a change amount of the orientation of the display unit 150 detected by the first detecting unit 151 and a change amount of the orientation of the terminal 170 detected by the second detecting unit 171, and output the change amounts to the CPU 901.
- The
output device 910 provides output data to the display unit 150. For example, the output device 910 provides display data to the display unit 150 under the control of the CPU 901. If the display unit 150 is configured from an audio output device, the output device 910 provides audio data to the display unit 150 under the control of the CPU 901.
- The storage device 911 is a device used to store data that is configured as an example of the storage unit 120 in the display control device 10. The storage device 911 may also include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like. This storage device 911 stores programs executed by the CPU 901 and various kinds of data.
- The drive 912 is a storage medium reader/writer, which may be built in or externally attached to the display control device 10. The drive 912 reads information recorded on a removable storage medium 71, such as a mounted magnetic disk, optical disc, magneto-optical disk, or semiconductor memory, and outputs the read information to the RAM 903. Further, the drive 912 can also write information to the removable storage medium 71.
- A hardware configuration example of the
display control device 10 according to an embodiment of the present disclosure was described above. - As described above, according to the embodiment of the present disclosure, there is provided a
display control device 10 which includes a first acquiring unit 111 configured to acquire the orientation of a display unit 150 detected by the first detecting unit 151, and a display control unit 113 configured to display content at the display unit 150, wherein the display control unit 113 scrolls the content according to the orientation of the display unit 150. According to this configuration, it is possible to provide a technique for allowing a user to easily scroll the content.
- Note that while details of the preferred embodiment of the present disclosure have been described with reference to the appended drawings, the technical scope of the present disclosure is not limited to these examples. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
- Further, a program for causing hardware such as a CPU, a ROM, and a RAM built into a computer to realize functions equivalent to those of the units included in the above-described display control device 10 can also be created. In addition, a computer-readable recording medium having this program recorded thereon can also be provided.
- Additionally, the present technology may also be configured as below.
- (1)
- A display control device including:
- a first acquiring unit configured to acquire orientation of a display unit detected by a first detecting unit; and
- a display control unit configured to display content at the display unit,
- wherein the display control unit scrolls the content according to the orientation of the display unit.
- (2)
- The display control device according to (1), further including:
- a second acquiring unit configured to acquire orientation of a terminal detected by a second detecting unit,
- wherein the display control unit performs predetermined control according to the orientation of the terminal.
- (3)
- The display control device according to (2),
- wherein the display control unit scrolls the content according to a relationship between the orientation of the display unit and the orientation of the terminal.
- (4)
- The display control device according to (3),
- wherein the display control unit scrolls the content according to the orientation of the terminal which is relative to the orientation of the display unit.
- (5)
- The display control device according to (4),
- wherein the display control unit controls scroll speed of the content according to an angular difference between the orientation of the display unit and the orientation of the terminal.
- (6)
- The display control device according to any one of (2) to (5),
- wherein the display control unit scrolls the content according to the orientation of the display unit and selects an object based on the orientation of the terminal.
- (7)
- The display control device according to any one of (1) to (6),
- wherein the display control unit determines whether or not to scroll the content based on whether or not a user inputs predetermined operation.
- (8)
- A display control method including:
- acquiring orientation of a display unit detected by a first detecting unit;
- displaying content at the display unit; and
- scrolling the content according to the orientation of the display unit.
- (9)
- A computer-readable recording medium having a program recorded therein, the program causing a computer to function as a display control device including:
- a first acquiring unit configured to acquire orientation of a display unit detected by a first detecting unit; and
- a display control unit configured to display content at the display unit,
- wherein the display control unit scrolls the content according to the orientation of the display unit.
-
- 1 information processing system
- 10 display control device
- 110 control unit
- 111 first acquiring unit
- 112 second acquiring unit
- 113 display control unit
- 120 storage unit
- 131 first input unit
- 132 second input unit
- 150 display unit
- 151 first detecting unit
- 170 terminal
- 171 second detecting unit
- B1 to B4 object
- C1 to C11 content
Claims (11)
1. An information processing system, comprising:
a head mounted device including a display screen configured to display content;
a hand-held remote terminal;
a first sensor configured to detect a first orientation of the display screen;
a second sensor configured to detect a second orientation of the hand-held remote terminal; and
at least one processor configured to:
determine an angular difference, in a first direction, between the detected first orientation of the display screen and the detected second orientation of the hand-held remote terminal, and
control, based on the determined angular difference, the display screen to scroll the displayed content in a second direction.
2. The information processing system according to claim 1,
wherein the second sensor includes a geomagnetic sensor, and
wherein the geomagnetic sensor is configured to detect the second orientation of the hand-held remote terminal associated with the first direction.
3. The information processing system according to claim 2, wherein
the second direction is one of a rightward direction in a first plane parallel to a second plane of the display screen or a leftward direction in the first plane parallel to the second plane of the display screen.
4. The information processing system according to claim 3, wherein the at least one processor is further configured to control, based on the determined angular difference, a scroll speed of the scroll of the displayed content.
5. The information processing system according to claim 4, wherein the at least one processor is further configured to increase the scroll speed of the displayed content based on an increase in the determined angular difference.
6. The information processing system according to claim 1,
wherein the hand-held remote terminal is configured to receive a user touch operation, and
wherein the at least one processor is further configured to control, based on the received user touch operation, the display screen to scroll the displayed content.
7. The information processing system according to claim 6, wherein
the at least one processor is further configured to:
restrict the scroll of the displayed content based on absence of reception of the user touch operation by the hand-held remote terminal; and
control the display screen to scroll the displayed content for a time period for which the user touch operation is received by the hand-held remote terminal.
8. The information processing system according to claim 1,
wherein the hand-held remote terminal includes a button,
wherein the hand-held remote terminal is configured to receive a tap operation on the button, and
wherein the at least one processor is further configured to:
control, based on the received tap operation and a region of focus of the displayed content, the display screen to display a selection result image; and
update, based on the tap operation on the button for a first time period, the display screen to return from the selection result image.
9. The information processing system according to claim 1, wherein
the hand-held remote terminal is a smartphone including the at least one processor and the second sensor.
10. An information processing method, comprising:
in an information processing system that includes a display screen and a hand-held remote terminal:
displaying content on the display screen;
detecting, by a first sensor, a first orientation of the display screen;
detecting, by a second sensor, a second orientation of the hand-held remote terminal; and
determining an angular difference, in a first direction, between the detected first orientation of the display screen and the detected second orientation of the hand-held remote terminal, and
controlling, based on the determined angular difference, the display screen to scroll the displayed content in a second direction.
11. A non-transitory computer-readable medium having stored thereon computer-executable instructions that, when executed by a processor, cause an information processing system to execute operations, the operations comprising:
displaying content on a display screen;
detecting, by a first sensor, a first orientation of the display screen;
detecting, by a second sensor, a second orientation of a hand-held remote terminal; and
determining an angular difference, in a first direction, between the detected first orientation of the display screen and the detected second orientation of the hand-held remote terminal, and
controlling, based on the determined angular difference, the display screen to scroll the displayed content in a second direction.