WO2016013269A1 - Image display device, image display method, and computer program - Google Patents
- Publication number
- WO2016013269A1 (PCT/JP2015/062929)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- head
- display
- image
- unit
- Prior art date
Classifications
- G06T19/006 — Mixed reality
- G02B27/0172 — Head-mounted displays characterised by optical features
- G02B2027/0138 — Head-up displays comprising image capture systems, e.g. camera
- G02B2027/014 — Head-up displays comprising information/image processing systems
- G02B2027/0141 — Head-up displays characterised by the informative content of the display
- G06F1/163 — Wearable computers, e.g. on a belt
- G06F1/1686 — Integrated I/O peripherals, the I/O peripheral being an integrated camera
- G06F21/31 — User authentication
- G06F21/32 — User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
- G06F21/36 — User authentication by graphic or iconic representation
- G06F3/005 — Input arrangements through a video camera
- G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012 — Head tracking input arrangements
- G06F3/013 — Eye tracking input arrangements
- G06F3/0304 — Detection arrangements using opto-electronic means
- G06F3/041 — Digitisers, e.g. for touch screens or touch pads
- G06F3/04815 — Interaction with a metaphor-based environment or interaction object displayed as three-dimensional
- G06F3/147 — Digital output to display device using display panels
- G09G3/001 — Control arrangements using specific devices, e.g. projection systems
- G09G5/00 — Control arrangements or circuits for visual indicators
- G09G5/36 — Display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/377 — Mixing or overlaying two or more graphic patterns
- G09G2340/12 — Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
- G09G2340/125 — Overlay of images wherein one of the images is motion video
- H04N5/64 — Constructional details of receivers, e.g. cabinets or dust covers
Definitions
- The technology disclosed in this specification relates to an image display device that is worn on the head and used for viewing an image, and in which the surrounding scenery can be observed by a video see-through method, as well as to an image display method and a computer program.
- An image display device that is worn on the head or face and used for viewing an image, that is, a head-mounted display, is known.
- In a head-mounted display, for example, an image display unit is arranged for each of the left and right eyes, and an enlarged virtual image of the displayed image is formed by a virtual-image optical system, allowing the user to observe a realistic image.
- Demand for head-mounted displays is high, and if mass production progresses in the future, they may spread like mobile phones, smartphones, and portable game machines, with each person owning one.
- A head-mounted display is equipped with a high-resolution display panel, made of, for example, liquid crystal or organic EL (electroluminescence) elements, as the display unit for each of the left and right eyes.
- Head-mounted displays can be classified into a transmissive type and a light-shielding type.
- A transmissive head-mounted display allows the wearer to observe the surrounding scenery while wearing it on the head and displaying an image (see, for example, Patent Document 1).
- This allows the user to avoid dangers such as colliding with obstacles.
- A light-shielding head-mounted display directly covers the wearer's eyes when worn on the head, which increases the sense of immersion when viewing an image. The display screen is enlarged and projected through a virtual-image optical system so that the user observes it as an enlarged virtual image at an appropriate angle of view, and multi-channel audio reproduced through headphones recreates the sense of presence of a movie theater (see, for example, Patent Document 2).
- Among light-shielding head-mounted displays, a video see-through type is known that has a built-in camera capable of photographing the scene in front of the wearer and displays the resulting external image; the surrounding scenery can thus be observed through the image (see, for example, Patent Documents 3 and 4).
- In contrast to video see-through, the former transmissive head-mounted display is called optical see-through, or simply see-through.
- Electronic devices have been proposed that include a press-detecting touch panel display on which a user authentication screen is shown for functions requiring authentication, and that release the lock by comparing input number information, indicating a plurality of different numbers entered by the user on the authentication screen, with stored personal identification number information (see, for example, Patent Document 5).
- For a head-mounted display as well, if its users are limited, such as in personal use, it is necessary to perform an authentication process when starting use.
- Eyeglass-type or head-mounted image display devices have also been proposed that permit display of an image on the display on the condition that biometric authentication, such as iris authentication, retina authentication, facial authentication, or skeleton authentication, has been performed (see, for example, Patent Document 6).
- In general, an authentication screen is displayed when use begins, and the normal operation screen does not appear until the authentication process has completed.
- In the case of a light-shielding head-mounted display, the wearer is effectively blindfolded: both eyes are covered and the field of view is lost. If the built-in camera is turned on and the captured image is shown as video see-through, the user regains a field of view and can observe the surrounding scenery. However, when the authentication function is set, the video see-through screen cannot be observed until the authentication process is performed, and remains hidden. If authentication is delayed, for example by a mistake in entering the password, the field of view is lost for a long time and the user is placed in a very dangerous state.
- An object of the technology disclosed in this specification is to provide an excellent image display device, image display method, and computer program that are worn on the head and used for viewing an image, and that allow the surrounding scenery to be observed by a video see-through method.
- A further object is to provide an excellent image display device, image display method, and computer program that can suitably perform authentication processing through the screen when use begins.
- The technique described in claim 1 of the present application is an image display device comprising a display unit used by being worn on the user's head or face, a detection unit for detecting whether the display unit is mounted on the user's head or face, an imaging unit for photographing the surroundings, and a control unit for controlling the image displayed by the display unit.
- When the detection unit detects that the display unit is mounted on the user's head or face, the control unit of the image display device switches the display unit from the non-display state to display of the captured image.
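The switching behaviour described above can be sketched in code. This is an illustrative Python sketch, not the patent's implementation; the class names, the `tick` polling loop, and the callable `wear_sensor`/`camera` interfaces are all assumptions made for the example.

```python
class Display:
    """Stand-in for the display unit worn in front of the eyes."""

    def __init__(self):
        self.mode = "off"   # non-display state until the unit is worn
        self.frame = None

    def show_camera(self, frame):
        # Display the captured image (video see-through).
        self.mode = "see-through"
        self.frame = frame


class HeadMountedDisplay:
    def __init__(self, wear_sensor, camera):
        self.wear_sensor = wear_sensor  # detection unit: () -> bool
        self.camera = camera            # imaging unit: () -> frame
        self.display = Display()

    def tick(self):
        # In response to the detection unit reporting that the unit is
        # mounted on the head or face, switch from the non-display
        # state to display of the captured image.
        if self.wear_sensor() and self.display.mode == "off":
            self.display.show_camera(self.camera())
```

The point of the sketch is the trigger: the display leaves the non-display state as soon as wearing is detected, without waiting for any other user operation.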
- the image display device according to claim 1 or 2 further includes an authentication processing unit that authenticates the user.
- In response to the detection unit detecting that the display unit is mounted on the user's head or face, the authentication processing unit of the image display device displays an authentication screen on the captured image.
- The technique described in claim 5 of the present application is an image display device comprising: a display unit used by being worn on the user's head or face; a detection unit for detecting whether the display unit is mounted on the user's head or face; an imaging unit for photographing the surroundings; an authentication processing unit for authenticating the user; and a control unit for controlling the image displayed on the screen by the display unit. During authentication processing by the authentication processing unit, the control unit displays an authentication screen on the image captured by the imaging unit.
- The authentication processing unit of the image display device according to claim 4 or 5 is configured to erase the authentication screen in response to completion of the authentication process.
- the image display device further includes a content acquisition unit that acquires content.
- the control unit is configured to switch the display of the display unit from the captured image to a reproduced image of the content in response to occurrence of a predetermined event.
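Taken together, the claims above describe a small set of screen states and transitions (also shown later in Figs. 6 and 7). The following is a minimal state-machine sketch of those transitions; the state and event names are illustrative, not taken from the patent.

```python
def next_screen(state, event, auth_enabled=True):
    """Screen transition sketch: non-display -> video see-through
    (with an authentication overlay when the function is set)
    -> content playback."""
    if state == "off" and event == "worn":
        # Video see-through starts as soon as wearing is detected;
        # when authentication is enabled, the authentication screen
        # is overlaid on the captured image.
        return "see-through+auth" if auth_enabled else "see-through"
    if state == "see-through+auth" and event == "auth-ok":
        # The authentication screen is erased on completion.
        return "see-through"
    if state == "see-through" and event == "play-content":
        # A predetermined event switches the display from the
        # captured image to a reproduced image of the content.
        return "content"
    return state  # all other events leave the screen unchanged
```

Note that in every path the user sees the video see-through image first; the content image only replaces it after the predetermined event.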
- The technique described in claim 8 of the present application is an image display method comprising: a detection step of detecting whether a display unit used by being worn on the user's head or face is mounted on the user's head or face; an image acquisition step of acquiring a captured image of the user's surroundings; and a display step of displaying the captured image on the display unit in response to detecting that the display unit is mounted on the user's head or face.
- The technique described in claim 9 of the present application is an image display method comprising: a detection step of detecting whether a display unit used by being worn on the user's head or face is mounted on the user's head or face; an image acquisition step of acquiring a captured image of the user's surroundings; and an authentication processing step of displaying an authentication screen on the captured image and authenticating the user.
- The technique described in claim 10 of the present application is a computer program written in a computer-readable format so as to cause a computer to function as: a detection unit for detecting whether a display unit used by being worn on the user's head or face is mounted on the user's head or face; an image acquisition unit for acquiring a captured image of the user's surroundings; and a display control unit that displays the captured image on the display unit in response to detecting that the display unit is mounted on the user's head or face.
- The technique described in claim 11 of the present application is a computer program written in a computer-readable format so as to cause a computer to function as: a detection unit for detecting whether a display unit used by being worn on the user's head or face is mounted on the user's head or face; an image acquisition unit for acquiring a captured image of the user's surroundings; and an authentication processing unit that displays an authentication screen on the captured image and authenticates the user.
- The computer programs according to claims 10 and 11 of the present application define computer programs described in a computer-readable format so as to realize predetermined processing on a computer. In other words, by installing the computer program according to claim 10 or 11 on a computer, a cooperative action is exhibited on the computer, and effects similar to those of the image display methods according to claims 8 and 9, respectively, can be obtained.
- According to the technology disclosed in this specification, an excellent image display device, image display method, and computer program can be provided that are worn on the head and used for viewing an image, and that can suitably perform authentication processing through the screen when use begins.
- In addition, the image display device to which the technology disclosed in this specification is applied displays a video see-through image as soon as it is mounted on the user's head or face, thereby keeping the user out of the dangerous state in which the field of view is obstructed.
- Further, an image display device to which the technology disclosed in this specification is applied can display an authentication screen superimposed on a video see-through image obtained by photographing the surrounding scenery with a built-in camera. The user can therefore perform the authentication process while observing the surroundings in the video see-through image. Even with a head-mounted image display device that covers the user's eyes, the user can safely perform the authentication process while continuously checking the state of the outside world.
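The superimposition described above can be illustrated as a simple per-pixel alpha blend of the authentication screen onto the camera frame, so that the surroundings stay visible behind the overlay. This is a sketch for illustration only; the patent does not specify a compositing method, and the function names and the frame-as-nested-lists representation are assumptions.

```python
def blend(camera_px, overlay_px, alpha=0.5):
    """Alpha-blend one RGB overlay pixel onto one camera pixel.

    alpha=0.5 keeps the video see-through image visible behind
    the authentication screen.
    """
    return tuple(
        round(alpha * o + (1.0 - alpha) * c)
        for c, o in zip(camera_px, overlay_px)
    )


def composite(camera_frame, overlay_frame, alpha=0.5):
    """Overlay the authentication screen on the captured frame."""
    return [
        [blend(c, o, alpha) for c, o in zip(crow, orow)]
        for crow, orow in zip(camera_frame, overlay_frame)
    ]
```

With a semi-transparent alpha, every composited pixel still carries a contribution from the camera image, which is exactly what lets the wearer watch the outside world while entering credentials.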
- FIG. 1 is a diagram showing a user wearing a head-mounted display 100 on the head as viewed from the front.
- FIG. 2 is a diagram illustrating a state in which a user wearing the head mounted display 100 illustrated in FIG. 1 is viewed from above.
- FIG. 3 is a diagram showing an example of the internal configuration of the head mounted display 100.
- FIG. 4 is a diagram illustrating an initial screen displayed on the head-mounted display 100 immediately after being worn by the user.
- FIG. 5 is a diagram illustrating a screen displayed when the head-mounted display 100 performs user authentication processing.
- FIG. 6 is a diagram showing a screen transition example of the head mounted display 100.
- FIG. 7 is a diagram showing a screen transition example of the head-mounted display 100 (when the authentication function is set).
- FIG. 1 shows a front view of a user wearing a head-mounted display 100 to which the technology disclosed in this specification is applied.
- The head-mounted display 100 directly covers the user's eyes when worn on the head or face, giving a sense of immersion to the user viewing an image. Unlike the see-through type, the user wearing the head-mounted display 100 cannot directly view the real-world scenery. However, by installing an outer camera 312 that captures the scenery in the user's line-of-sight direction and displaying the captured image, the user can indirectly view the real-world scenery (that is, the scenery is displayed by video see-through). Of course, a virtual display image such as an AR (Augmented Reality) image can be superimposed on the video see-through image. Further, since the display image cannot be seen from the outside (that is, by other people), privacy is easily protected when displaying information.
- a display panel (not shown in FIG. 1) for the user to observe is disposed at a position facing the left and right eyes inside the head mounted display 100 main body.
- The display panel is configured by a microdisplay such as an organic EL element or a liquid crystal display, or by a laser-scanning display such as a direct retinal projection display.
- An outer camera 312 for inputting a surrounding image is installed in the approximate center of the front surface of the head mounted display 100 main body.
- microphones 103L and 103R are installed near the left and right ends of the head-mounted display 100 main body, respectively.
- a touch panel 315 that allows a user to perform touch input using a fingertip or the like is disposed outside the head mounted display 100 main body.
- a pair of left and right touch panels 315 are provided, but a single touch panel or three or more touch panels 315 may be provided.
- FIG. 2 shows a state in which the user wearing the head mounted display 100 shown in FIG. 1 is viewed from above.
- The illustrated head-mounted display 100 has left-eye and right-eye display panels 104L and 104R on the side facing the user's face.
- The display panels 104L and 104R are configured by a microdisplay such as an organic EL element or a liquid crystal display, or by a laser-scanning display such as a direct retinal projection display.
- the display images on the display panels 104L and 104R are observed by the user as enlarged virtual images by passing through the virtual image optical units 101L and 101R.
- an eye width adjustment mechanism 105 is provided between the display panel for the right eye and the display panel for the left eye.
- FIG. 3 shows an internal configuration example of the head mounted display 100. Hereinafter, each part will be described.
- the control unit 301 includes a ROM (Read Only Memory) 301A and a RAM (Random Access Memory) 301B.
- the ROM 301A stores program codes executed by the control unit 301 and various data.
- The control unit 301 executes a program loaded into the RAM 301B, thereby performing image display control and centrally controlling the overall operation of the head-mounted display 100.
- Examples of the programs stored in the ROM 301A include an authentication processing program executed at the start of use, and a display control program for displaying on the screen a see-through image captured by the outer camera 312 or a playback moving image. Data stored in the ROM 301A includes identification information unique to the head-mounted display 100, authentication information for authenticating the user of the head-mounted display 100 (for example, a password, a PIN, or biometric information), and other user attribute information.
- The input interface (IF) unit 302 includes one or more operators (none of which are shown) such as keys, buttons, and switches on which the user performs input operations; it accepts user instructions via the operators and outputs them to the control unit 301. The input interface unit 302 similarly accepts user instructions consisting of remote control commands received by the remote control reception unit 303 and outputs them to the control unit 301.
- When the user performs a touch operation with a fingertip on the touch panel 315 disposed outside the head-mounted display 100 main body, the input interface (IF) unit 302 inputs coordinate data of the touched fingertip position and the like to the control unit 301.
- Since the touch panel 315 is disposed on the front surface of the head-mounted display 100 main body, directly behind the display image of the display unit 309 (the enlarged virtual image observed through the virtual image optical unit 310), the user can perform touch operations as if touching the display image with a fingertip.
- the status information acquisition unit 304 is a functional module that acquires status information of the head mounted display 100 main body or a user wearing the head mounted display 100.
- The state information acquisition unit 304 may itself be equipped with various sensors for detecting state information, or some or all of these sensors may be provided in an external device (for example, a smartphone, wristwatch, or other multifunction terminal worn by the user), in which case the state information is acquired via the communication unit 305 (described later).
- the status information acquisition unit 304 acquires, for example, information on the position and posture of the user's head or information on the posture in order to track the user's head movement.
- The state information acquisition unit 304 includes, for example, sensors capable of detecting a total of nine axes: a three-axis gyro sensor, a three-axis acceleration sensor, and a three-axis geomagnetic sensor.
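As a rough illustration of how such a nine-axis package is commonly turned into a head-pose estimate (the patent does not specify the fusion method; the function name and the 0.98 blend constant below are hypothetical), a complementary filter blends the fast-but-drifting gyro integration with the slow-but-absolute gravity direction from the accelerometer:

```python
import math

def fuse_pitch(prev_pitch, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """Complementary filter for one axis (pitch), in radians.

    gyro_rate: angular rate in rad/s (short-term accurate, drifts over time)
    accel_y, accel_z: accelerometer components used to recover the
    gravity-implied pitch (long-term stable, but noisy during motion)
    """
    gyro_pitch = prev_pitch + gyro_rate * dt      # integrate the gyro
    accel_pitch = math.atan2(accel_y, accel_z)    # pitch implied by gravity
    # High-pass the gyro estimate, low-pass the accelerometer estimate.
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch
```

In practice a full head tracker fuses all nine axes into a rotation matrix or quaternion; this single-axis sketch only shows the blending principle.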
- the state information acquisition unit 304 may use any one or more sensors such as a GPS (Global Positioning System) sensor, a Doppler sensor, an infrared sensor, and a radio wave intensity sensor.
- The state information acquisition unit 304 may further use, in combination, information provided from various infrastructures for acquiring position and posture information, such as mobile phone base station information and PlaceEngine (registered trademark) information (field-strength measurement information from wireless LAN access points).
- In this embodiment, the state acquisition unit 304 for tracking head movement is built into the head-mounted display 100, but it may instead be configured as an accessory part externally attached to the head-mounted display 100. In the latter case, the externally attached state acquisition unit 304 expresses the head posture information in the form of, for example, a rotation matrix, and transmits it to the head-mounted display 100 main body via wireless communication such as Bluetooth (registered trademark) communication or via a high-speed wired interface such as USB (Universal Serial Bus).
- The state information acquisition unit 304 may also acquire, as state information of the user wearing the head-mounted display 100, for example, the user's work state (whether or not the head-mounted display 100 is worn), action state (a moving state such as stationary, walking, or running; gestures made with the hand or fingertips; the open/closed state of the eyelids; the gaze direction; the pupil size), mental state (the degree of immersion, excitement, wakefulness, feelings, emotions, and the like while the user observes the display image, such as whether the user is absorbed in or concentrating on it), and physiological state.
- In order to acquire this state information from the user, the state information acquisition unit 304 may be provided with various state sensors (none of which are shown), such as the outer camera 312, a wearing sensor consisting of a mechanical switch or the like, an inner camera that captures the user's face, a gyro sensor, an acceleration sensor, a speed sensor, a pressure sensor, a temperature sensor that detects body temperature or air temperature, a sweat sensor, a pulse sensor, a myoelectric potential sensor, an electro-oculogram sensor, an electroencephalogram sensor, an exhalation sensor, and a gas/ion concentration sensor.
- The state information acquisition unit 304 may also detect whether or not the head-mounted display 100 is being worn by using a mounting sensor that operates in conjunction with the motion of the head-mounted display contacting the user's forehead when it is placed on the user's head (see, for example, Patent Document 8).
- the environment information acquisition unit 316 is a functional module that acquires information related to the environment surrounding the head mounted display 100 main body or the user wearing the head mounted display 100.
- Information on the environment here includes sound, air flow, temperature, atmospheric pressure, atmosphere (smoke, dense fog), electromagnetic waves (ultraviolet rays, blue light, radio waves), heat rays (infrared rays), radiation, and the like to which the head-mounted display 100 or the user is exposed.
- the environment sensor may include the above-described microphone and the outside camera 312.
- Alternatively, an external device (for example, a smartphone, wristwatch, or other multifunction terminal worn by the user) may include some or all of these sensors, in which case the environment information acquisition unit 316 acquires the environment information via the communication unit 305 (described later).
- The outer camera 312 is disposed, for example, in the approximate center of the front surface of the head-mounted display 100 main body (see FIG. 2), and can capture an image of the surroundings. It is assumed that the user can adjust the zoom of the outer camera 312 through an operation on the input operation unit 302, through the pupil size recognized by the inner camera or a myoelectric potential sensor, or by voice input. Further, by performing posture control of the outer camera 312 in the pan, tilt, and roll directions in accordance with the user's line-of-sight direction acquired by the state information acquisition unit 304, the outer camera 312 can capture an image in the user's own line-of-sight direction, that is, an image of what the user is looking at. The captured image of the outer camera 312 can be displayed and output on the display unit 309, and can also be transmitted from the communication unit 305 or stored in the storage unit 306.
- The outer camera 312 is preferably composed of a plurality of cameras so that three-dimensional information of the surrounding image can be acquired using parallax information.
- Even with a single camera, shooting can be performed while moving the camera using SLAM (Simultaneous Localization and Mapping) image recognition, parallax information can be calculated from a plurality of frame images separated in time (see, for example, Patent Document 9), and three-dimensional information of the surrounding image can be acquired from the calculated parallax information.
- Since the outer camera 312 can acquire three-dimensional information, it can also be used as a distance sensor.
- a distance sensor made of an inexpensive device such as PSD (Position Sensitive Detector) that detects a reflection signal from an object may be used in combination with the outer camera 312.
- the outer camera 312 and the distance sensor can be used to detect the body position, posture, and shape of the user wearing the head mounted display 100.
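As background for the distance-sensing use just described (not part of the patent text), the standard pinhole-stereo relation converts a pixel disparity between two camera views into depth, Z = f·B/d; the focal length and baseline values below are illustrative:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth from stereo disparity: Z = f * B / d.

    focal_px:     focal length in pixels
    baseline_m:   distance between the two camera centers, in meters
    disparity_px: horizontal offset of a matched point between views
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

For example, with a 700-pixel focal length and a 6 cm baseline, a 7-pixel disparity corresponds to a point about 6 m away; smaller disparities mean greater depth.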
- the communication unit 305 performs communication processing with an external device (not shown), modulation / demodulation of communication signals, and encoding / decoding processing.
- Examples of the external device include a content playback device (a Blu-ray disc or DVD player) that supplies viewing content when the user uses the head-mounted display 100, and a streaming server.
- the control unit 301 transmits transmission data to the external device from the communication unit 305.
- the configuration of the communication unit 305 is arbitrary.
- the communication unit 305 can be configured according to a communication method used for transmission / reception operations with an external device that is a communication partner.
- the communication method may be either wired or wireless.
- Communication standards mentioned here include MHL (Mobile High-definition Link), USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), Wi-Fi (registered trademark), Bluetooth (registered trademark) communication, BLE (Bluetooth (registered trademark) Low Energy) communication, ultra-low-power wireless communication such as ANT, and mesh networks standardized by IEEE 802.11s.
- The communication unit 305 may also be a cellular radio transceiver that operates according to a cellular communication standard.
- the storage unit 306 is a large-capacity storage device configured by an SSD (Solid State Drive) or the like.
- The storage unit 306 stores application programs executed by the control unit 301 and various data. For example, content that the user views on the head-mounted display 100 is stored in the storage unit 306.
- the image processing unit 307 further performs signal processing such as image quality correction on the image signal output from the control unit 301 and converts the image signal to a resolution that matches the screen of the display unit 309.
- the display driving unit 308 sequentially selects the pixels of the display unit 309 for each row and performs line sequential scanning, and supplies a pixel signal based on the image signal subjected to signal processing.
- the display unit 309 includes a display panel including a micro display such as an organic EL (Electro-Luminescence) element or a liquid crystal display, or a laser scanning display such as a retina direct drawing display.
- the virtual image optical unit 310 enlarges and projects the display image of the display unit 309 and causes the user to observe it as an enlarged virtual image.
- Display images output on the display unit 309 include commercial content (a virtual world) supplied from a content playback device (a Blu-ray disc or DVD player) or a streaming server, and images captured by the outer camera 312 (real-world images such as user-view images).
- The audio processing unit 313 performs signal processing such as sound quality correction and audio amplification on the audio signal output from the control unit 301, as well as signal processing of the input audio signal. The voice input/output unit 314 then outputs the processed audio to the outside and inputs audio from the microphones (described above).
- the head-mounted display 100 is light-shielding, that is, covers the eyes of the user who wears it.
- the display unit 309 displays commercial content such as a movie or an image expressed by computer graphics or the like.
- the virtual image optical unit 310 enlarges and projects the display image of the display unit 309, and causes the user to observe it as an enlarged virtual image having an appropriate angle of view, and reproduces a sense of reality such as watching in a movie theater, for example.
- the user can experience an immersion in the virtual world displayed on the display unit 309.
- The head-mounted display 100 is driven by a battery (not shown) and is equipped with various power-saving functions. For example, the mounting sensor (described above) detects that the user has put on the head-mounted display 100, and an image is displayed on the display unit 309 only in the mounted state; in the non-mounted state, the display unit 309 is set to the screen non-display state to save power.
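The wear-sensor-driven power saving described here can be sketched as a small controller; the class name, method name, and state strings are illustrative, not from the patent:

```python
class DisplayPowerController:
    """Drive the display from wear-sensor events to save battery."""

    def __init__(self):
        self.display_on = False

    def on_sensor(self, worn):
        # Display an image only in the mounted state; in the
        # non-mounted state, blank the screen to cut power consumption.
        self.display_on = worn
        return "video-see-through" if worn else "screen-off"
```

Each wear/remove event from the mounting sensor drives the display directly, so the panel never stays powered while the device sits unused.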
- Since use of the head-mounted display 100 according to the present embodiment is limited to a specific individual user or a specific user group, it is assumed that the head-mounted display 100 is equipped with a function for authenticating the wearer and is set to perform authentication processing when use starts.
- The authentication processing method itself is not particularly limited. For example, input of authentication information (a password or PIN) using the touch panel 315, authentication processing based on the movement of the user's gaze point (see, for example, Patent Document 7), or authentication processing based on biometric information collected from the wearer can be used. In any authentication method, however, an authentication screen (an authentication information input screen, or an authentication screen indicating that authentication processing is in progress) is displayed on the display unit 309 while the authentication processing is being performed.
- The timing at which use of the head-mounted display 100 starts is not when the main power is turned on, but when the mounting sensor (described above) detects that the user has put it on. Therefore, even if the main power supply of the head-mounted display 100 is turned on, the authentication processing is not started unless the user is wearing it on the head or face. When it is detected that the user has put on the head-mounted display 100, the authentication processing is started and the authentication screen is displayed.
- When the user wears the light-shielding head-mounted display 100, both eyes are covered, the field of view is lost, and the user is effectively blindfolded. Therefore, in this embodiment, when the mounting sensor (described above) detects that the user has mounted the head-mounted display 100 on the head or face and started using it, the video see-through image captured by the outer camera 312 is immediately displayed on the display unit 309 as the initial screen 401, as shown in FIG. 4. Accordingly, when the user puts on the head-mounted display 100, the field of view is immediately restored and the user can observe the surrounding scenery. Thereafter, the screen may be switched to a display screen of commercial content (a movie or the like) input from the outside in accordance with a user operation or the like.
- When the authentication function is set, once the wearing sensor detects that the user has mounted the head-mounted display 100 on the head or face and use begins, an authentication screen is displayed on the display unit 309. If the authentication processing takes time, for example because of an error in the authentication operation, the authentication screen is not dismissed quickly.
- Therefore, while the authentication processing is being performed, the head-mounted display 100 according to this embodiment superimposes an authentication screen 502 on the video see-through image 501 captured by the outer camera 312, as shown in FIG. 5. The user can thus observe the surrounding scenery through the video see-through image even during the authentication processing.
- the user can always know his / her surroundings, so that safety can be ensured.
- When the authentication processing ends, the authentication screen disappears and only the video see-through image remains as the initial screen. Thereafter, the screen may be switched to a display screen of commercial content (a movie or the like) input from the outside in accordance with a user operation or the like.
- The screen design of the authentication screen 502 is arbitrary and depends on the authentication method used. By making the authentication screen 502 superimposed on the video see-through image transparent or translucent, the visibility of the surrounding scenery shown by the video see-through display is improved.
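Rendering the authentication screen translucently over the camera frame amounts to per-pixel alpha blending; a minimal sketch (the 0.5 default opacity and the pixel values are illustrative, not from the patent):

```python
def blend_pixel(camera_rgb, overlay_rgb, opacity=0.5):
    """Alpha-blend one overlay pixel onto one camera (see-through) pixel.

    opacity=0.0 leaves the see-through image untouched;
    opacity=1.0 fully hides it behind the authentication screen.
    """
    return tuple(
        round((1.0 - opacity) * c + opacity * o)
        for c, o in zip(camera_rgb, overlay_rgb)
    )
```

Lowering the opacity of the authentication screen's pixels is what lets the wearer keep seeing the surroundings through it during authentication.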
- the outer camera 312 preferably captures an image that substantially matches the user's line of sight, or displays a video see-through image obtained by converting the captured image to match the user's field of view.
- the technology disclosed herein is not limited to video see-through images that match the user's field of view.
- FIG. 6 shows a screen transition example of the head mounted display 100.
- The head-mounted display 100 transitions from the non-wearing state 601 to the wearing state 602, triggered by the state information acquisition unit 304, which includes a wearing sensor or the like, detecting that the user has put on the head-mounted display 100.
- the display unit 309 transitions from the non-display screen 611 to the initial screen 612.
- On the initial screen 612, a video see-through image 613 captured by the outer camera 312 is displayed. Therefore, when the user puts on the head-mounted display 100, the field of view is immediately restored and the user can observe the surrounding scenery. That is, since the user always knows the situation around him or her, safety can be ensured.
- Thereafter, the screen may be switched to a display screen 621 for externally input commercial content (such as a movie) in accordance with a user operation on the input operation unit 302 or the like.
- When the user puts on the head-mounted display 100 in this way, the video see-through image 613 is displayed and normal use of the head-mounted display 100 begins.
- When the mounting sensor or the like detects that the head-mounted display 100 has been removed, the display returns to the non-mounted state 601.
- When the head-mounted display 100 is put on again, the same screen transitions as described above are repeated.
- FIG. 7 shows a screen transition example of the head mounted display 100 when the authentication function is set.
- The head-mounted display 100 transitions from the non-wearing state 701 to the wearing state 702, triggered by the state information acquisition unit 304, which includes a wearing sensor or the like, detecting that the user has put on the head-mounted display 100.
- the display unit 309 transitions from the non-display screen 711 to the initial screen 712.
- On the initial screen 712, authentication processing content 714 is superimposed on the video see-through image 713 captured by the outer camera 312.
- The authentication processing content 714 here is, for example, an authentication information input screen or an authentication screen indicating that authentication processing is in progress. Therefore, even when the user puts on the head-mounted display 100 and the authentication processing starts, the user can observe the surrounding scenery through the authentication processing content 714. That is, since the user always knows the situation around him or her, safety can be ensured.
- When the authentication processing ends, the authentication processing content 714 disappears and only the video see-through image 713 remains.
- the screen may be changed to a display screen 721 for commercial content (such as a movie) input from the outside in accordance with a user operation on the input operation unit 302 or the like.
- the video see-through image 713 is displayed as soon as the user wears the head-mounted display 100, and when the authentication process is successful, normal use of the head-mounted display 100 is started.
- When the mounting sensor or the like detects that the head-mounted display 100 has been removed, the state returns to the non-mounted state 701.
- When the head-mounted display 100 is put on again, the same authentication processing as described above is executed again.
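The screen transitions of FIG. 6 (authentication off) and FIG. 7 (authentication on) can be summarized together as a small state machine; the state and event names below paraphrase the description and are not taken from the patent:

```python
class HmdScreen:
    """Screen state driven by wear-sensor and authentication events."""

    def __init__(self, auth_enabled=False):
        self.auth_enabled = auth_enabled
        self.screen = "non-display"

    def on_worn(self):
        # Mounting immediately restores the field of view via video
        # see-through; with authentication enabled, the auth screen is
        # superimposed on that image rather than replacing it.
        self.screen = ("see-through+auth" if self.auth_enabled
                       else "see-through")

    def on_auth_success(self):
        # Successful authentication removes only the overlay.
        if self.screen == "see-through+auth":
            self.screen = "see-through"

    def on_removed(self):
        # Removal blanks the screen; re-wearing repeats the cycle.
        self.screen = "non-display"
```

The key property, in either mode, is that the camera image is shown from the moment of mounting, so the wearer is never left without a view of the surroundings.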
- The technology disclosed in this specification has been described above mainly with respect to an embodiment applied to a video see-through head-mounted display, but the gist of the technology disclosed in this specification is not limited to this.
- The technology disclosed in this specification can be similarly applied to various types of image display devices other than head-mounted displays that allow the surrounding scenery to be observed by video see-through.
- the technology disclosed in this specification can also be applied to various types of image display devices with cameras, such as mobile phones, smartphones, tablet terminals, and personal computers.
- (1) An image display device comprising: a display unit used by being worn on a user's head or face; a detection unit that detects whether or not the display unit is mounted on the user's head or face; an imaging unit that captures the surroundings; and a control unit that controls an image displayed on the screen of the display unit, wherein the control unit causes the display unit to display an image captured by the imaging unit in response to the detection unit detecting that the display unit has been mounted on the user's head or face.
- (2) The image display device according to (1) above, wherein the control unit switches the display unit from a non-display state to display of the captured image in response to the detection unit detecting that the display unit has been mounted on the user's head or face.
- (3) The image display device according to either (1) or (2) above, further comprising an authentication processing unit that authenticates the user.
- (4) The image display device according to (3) above, wherein the authentication processing unit displays an authentication screen on the captured image in response to the detection unit detecting that the display unit has been mounted on the user's head or face.
- (5) An image display device comprising: a display unit used by being worn on a user's head or face; a detection unit that detects whether or not the display unit is mounted on the user's head or face; an imaging unit that captures the surroundings; an authentication processing unit that authenticates the user; and a control unit that controls an image displayed on the screen of the display unit, wherein the control unit displays an authentication screen on an image captured by the imaging unit during authentication processing by the authentication processing unit.
- (6) The image display device according to either (4) or (5) above, wherein the authentication processing unit erases the authentication screen in response to the end of the authentication processing.
- (7) The image display device according to any one of (1) to (6) above, further comprising a content acquisition unit that acquires content, wherein the control unit switches the display of the display unit from the captured image to a reproduced image of the content in response to occurrence of a predetermined event.
- DESCRIPTION OF SYMBOLS: 100 ... Head-mounted display; 101L, 101R ... Virtual image optical unit; 103L, 103R ... Microphone; 104L, 104R ... Display panel; 105 ... Eye width adjustment mechanism; 301 ... Control unit; 301A ... ROM; 301B ... RAM; 302 ... Input operation unit; 303 ... Remote control receiving unit; 304 ... State information acquisition unit; 305 ... Communication unit; 306 ... Storage unit; 307 ... Image processing unit; 308 ... Display driving unit; 309 ... Display unit; 310 ... Virtual image optical unit; 312 ... Outer camera; 313 ... Audio processing unit; 314 ... Audio input/output unit; 315 ... Touch panel; 316 ... Environment information acquisition unit
Abstract
Description
An image display device comprising:
a display unit used by being worn on a user's head or face;
a detection unit that detects whether or not the display unit is mounted on the user's head or face;
an imaging unit that captures the surroundings; and
a control unit that controls an image displayed on the screen of the display unit,
wherein the control unit causes the display unit to display an image captured by the imaging unit in response to the detection unit detecting that the display unit has been mounted on the user's head or face.
An image display device comprising:
a display unit used by being worn on a user's head or face;
a detection unit that detects whether or not the display unit is mounted on the user's head or face;
an imaging unit that captures the surroundings;
an authentication processing unit that authenticates the user; and
a control unit that controls an image displayed on the screen of the display unit,
wherein the control unit displays an authentication screen on an image captured by the imaging unit during authentication processing by the authentication processing unit.
An image display method comprising:
a detection step of detecting whether or not a display unit used by being worn on a user's head or face is mounted on the user's head or face;
an image acquisition step of acquiring a captured image of the user's surroundings; and
a display step of causing the display unit to display the captured image in response to detecting that the display unit has been mounted on the user's head or face.
An image display method comprising:
a detection step of detecting whether or not a display unit used by being worn on a user's head or face is mounted on the user's head or face;
an image acquisition step of acquiring a captured image of the user's surroundings; and
an authentication processing step of displaying an authentication screen on the captured image and authenticating the user.
A computer program written in a computer-readable format so as to cause a computer to function as:
a detection unit that detects whether or not a display unit used by being worn on a user's head or face is mounted on the user's head or face;
an image acquisition unit that acquires a captured image of the user's surroundings; and
a display unit that displays the captured image in response to detecting that the display unit has been mounted on the user's head or face.
A computer program written in a computer-readable format so as to cause a computer to function as:
a detection unit that detects whether or not a display unit used by being worn on a user's head or face is mounted on the user's head or face;
an image acquisition unit that acquires a captured image of the user's surroundings; and
an authentication processing unit that displays an authentication screen on the captured image and authenticates the user.
(1) An image display device comprising: a display unit used by being worn on a user's head or face; a detection unit that detects whether or not the display unit is mounted on the user's head or face; an imaging unit that captures the surroundings; and a control unit that controls an image displayed on the screen of the display unit, wherein the control unit causes the display unit to display an image captured by the imaging unit in response to the detection unit detecting that the display unit has been mounted on the user's head or face.
(2) The image display device according to (1) above, wherein the control unit switches the display unit from a non-display state to display of the captured image in response to the detection unit detecting that the display unit has been mounted on the user's head or face.
(3) The image display device according to either (1) or (2) above, further comprising an authentication processing unit that authenticates the user.
(4) The image display device according to (3) above, wherein the authentication processing unit displays an authentication screen on the captured image in response to the detection unit detecting that the display unit has been mounted on the user's head or face.
(5) An image display device comprising: a display unit used by being worn on a user's head or face; a detection unit that detects whether or not the display unit is mounted on the user's head or face; an imaging unit that captures the surroundings; an authentication processing unit that authenticates the user; and a control unit that controls an image displayed on the screen of the display unit, wherein the control unit displays an authentication screen on an image captured by the imaging unit during authentication processing by the authentication processing unit.
(6) The image display device according to either (4) or (5) above, wherein the authentication processing unit erases the authentication screen in response to the end of the authentication processing.
(7) The image display device according to any one of (1) to (6) above, further comprising a content acquisition unit that acquires content, wherein the control unit switches the display of the display unit from the captured image to a reproduced image of the content in response to occurrence of a predetermined event.
(8) An image display method comprising: a detection step of detecting whether or not a display unit used by being worn on a user's head or face is mounted on the user's head or face; an image acquisition step of acquiring a captured image of the user's surroundings; and a display step of causing the display unit to display the captured image in response to detecting that the display unit has been mounted on the user's head or face.
(9) An image display method comprising: a detection step of detecting whether or not a display unit used by being worn on a user's head or face is mounted on the user's head or face; an image acquisition step of acquiring a captured image of the user's surroundings; and an authentication processing step of displaying an authentication screen on the captured image and authenticating the user.
(10) A computer program written in a computer-readable format so as to cause a computer to function as: a detection unit that detects whether or not a display unit used by being worn on a user's head or face is mounted on the user's head or face; an image acquisition unit that acquires a captured image of the user's surroundings; and a display unit that displays the captured image in response to detecting that the display unit has been mounted on the user's head or face.
(11) A computer program written in a computer-readable format so as to cause a computer to function as: a detection unit that detects whether or not a display unit used by being worn on a user's head or face is mounted on the user's head or face; an image acquisition unit that acquires a captured image of the user's surroundings; and an authentication processing unit that displays an authentication screen on the captured image and authenticates the user.
Claims (11)
- An image display device comprising: a display unit used by being worn on a user's head or face; a detection unit that detects whether or not the display unit is mounted on the user's head or face; an imaging unit that captures the surroundings; and a control unit that controls an image displayed on the screen of the display unit, wherein the control unit causes the display unit to display an image captured by the imaging unit in response to the detection unit detecting that the display unit has been mounted on the user's head or face.
- The image display device according to claim 1, wherein the control unit switches the display unit from a non-display state to display of the captured image in response to the detection unit detecting that the display unit has been mounted on the user's head or face.
- The image display device according to either claim 1 or claim 2, further comprising an authentication processing unit that authenticates the user.
- The image display device according to claim 3, wherein the authentication processing unit displays an authentication screen on the captured image in response to the detection unit detecting that the display unit has been mounted on the user's head or face.
- An image display device comprising: a display unit used by being worn on a user's head or face; a detection unit that detects whether or not the display unit is mounted on the user's head or face; an imaging unit that captures the surroundings; an authentication processing unit that authenticates the user; and a control unit that controls an image displayed on the screen of the display unit, wherein the control unit displays an authentication screen on an image captured by the imaging unit during authentication processing by the authentication processing unit.
- The image display device according to either claim 4 or claim 5, wherein the authentication processing unit erases the authentication screen in response to the end of the authentication processing.
- The image display device according to any one of claims 1 to 6, further comprising a content acquisition unit that acquires content, wherein the control unit switches the display of the display unit from the captured image to a reproduced image of the content in response to occurrence of a predetermined event.
- An image display method comprising: a detection step of detecting whether or not a display unit used by being worn on a user's head or face is mounted on the user's head or face; an image acquisition step of acquiring a captured image of the user's surroundings; and a display step of causing the display unit to display the captured image in response to detecting that the display unit has been mounted on the user's head or face.
- An image display method comprising: a detection step of detecting whether or not a display unit used by being worn on a user's head or face is mounted on the user's head or face; an image acquisition step of acquiring a captured image of the user's surroundings; and an authentication processing step of displaying an authentication screen on the captured image and authenticating the user.
- A computer program written in a computer-readable format so as to cause a computer to function as: a detection unit that detects whether or not a display unit used by being worn on a user's head or face is mounted on the user's head or face; an image acquisition unit that acquires a captured image of the user's surroundings; and a display unit that displays the captured image in response to detecting that the display unit has been mounted on the user's head or face.
- A computer program written in a computer-readable format so as to cause a computer to function as: a detection unit that detects whether or not a display unit used by being worn on a user's head or face is mounted on the user's head or face; an image acquisition unit that acquires a captured image of the user's surroundings; and an authentication processing unit that displays an authentication screen on the captured image and authenticates the user.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/325,308 US20170186236A1 (en) | 2014-07-22 | 2015-04-30 | Image display device, image display method, and computer program |
JP2016535819A JPWO2016013269A1 (ja) | 2014-07-22 | 2015-04-30 | 画像表示装置及び画像表示方法、並びにコンピューター・プログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014149210 | 2014-07-22 | ||
JP2014-149210 | 2014-07-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016013269A1 true WO2016013269A1 (ja) | 2016-01-28 |
Family
ID=55162808
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/062929 WO2016013269A1 (ja) | 2014-07-22 | 2015-04-30 | 画像表示装置及び画像表示方法、並びにコンピューター・プログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US20170186236A1 (ja) |
JP (1) | JPWO2016013269A1 (ja) |
WO (1) | WO2016013269A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018101162A1 (ja) * | 2016-12-01 | 2018-06-07 | 株式会社ソニー・インタラクティブエンタテインメント | ヘッドマウントディスプレイ、表示制御装置、表示制御方法及びプログラム |
JP2019525510A (ja) * | 2016-05-25 | 2019-09-05 | チンタオ ゴーアテック テクノロジー カンパニー リミテッドQingdao Goertek Technology Co., Ltd. | 仮想現実ヘルメット及びその使用方法 |
JP2020021346A (ja) * | 2018-08-02 | 2020-02-06 | 富士通株式会社 | 表示制御プログラム、表示制御方法、情報処理装置およびヘッドマウントユニット |
JP2020034646A (ja) * | 2018-08-28 | 2020-03-05 | 株式会社Nttドコモ | ウェアラブル端末及び表示システム |
JPWO2021156939A1 (ja) * | 2020-02-04 | 2021-08-12 |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3485425B1 (en) * | 2016-07-14 | 2023-08-23 | Magic Leap, Inc. | Deep neural network for iris identification |
AU2017361061B2 (en) | 2016-11-15 | 2022-02-03 | Magic Leap, Inc. | Deep learning system for cuboid detection |
KR102656425B1 (ko) * | 2016-12-07 | 2024-04-12 | 삼성전자주식회사 | 영상을 표시하는 전자 장치 및 방법 |
CN106682468A (zh) * | 2016-12-30 | 2017-05-17 | 百度在线网络技术(北京)有限公司 | 解锁电子设备的方法以及电子设备 |
CN111033524A (zh) | 2017-09-20 | 2020-04-17 | 奇跃公司 | 用于眼睛跟踪的个性化神经网络 |
JP7181928B2 (ja) | 2017-10-26 | 2022-12-01 | マジック リープ, インコーポレイテッド | 深層マルチタスクネットワークにおける適応的損失平衡のための勾配正規化システムおよび方法 |
US10776475B2 (en) * | 2017-11-13 | 2020-09-15 | International Business Machines Corporation | Secure password input in electronic devices |
KR20190089627A (ko) * | 2018-01-23 | 2019-07-31 | 삼성전자주식회사 | Ar 서비스를 제공하는 디바이스 및 그 동작 방법 |
US20190289396A1 (en) * | 2018-03-15 | 2019-09-19 | Microsoft Technology Licensing, Llc | Electronic device for spatial output |
US10674305B2 (en) | 2018-03-15 | 2020-06-02 | Microsoft Technology Licensing, Llc | Remote multi-dimensional audio |
US10520739B1 (en) * | 2018-07-11 | 2019-12-31 | Valve Corporation | Dynamic panel masking |
JP7432339B2 (ja) * | 2019-10-29 | 2024-02-16 | Hitachi-LG Data Storage, Inc. | Head-mounted display |
US20210365533A1 (en) * | 2020-05-20 | 2021-11-25 | Facebook Technologies, Llc | Systems and methods for authenticating a user of a head-mounted display |
GB2628281A (en) * | 2022-01-07 | 2024-09-18 | Jumio Corp | Biometric authentication using head-mounted devices |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004258123A (ja) * | 2003-02-24 | 2004-09-16 | Canon Inc | Display control method and display control device |
JP2006205930A (ja) * | 2005-01-28 | 2006-08-10 | Konica Minolta Photo Imaging Inc | Video display device |
JP2008257671A (ja) * | 2007-03-15 | 2008-10-23 | Canon Inc | Information processing device and information processing method |
JP2012053629A (ja) * | 2010-08-31 | 2012-03-15 | Canon Inc | Image generation device, image generation method, and program |
JP2012212023A (ja) * | 2011-03-31 | 2012-11-01 | Brother Ind Ltd | Head-mounted display and brightness adjustment method therefor |
JP2014123883A (ja) * | 2012-12-21 | 2014-07-03 | Nikon Corp | Head-mounted information input/output device and head-mounted information input/output method |
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6124976A (en) * | 1998-03-17 | 2000-09-26 | Sony Corporation | Voltage controlling method for head mounted display unit and head mounted display apparatus |
KR100656342B1 (ko) * | 2004-12-16 | 2006-12-11 | Electronics and Telecommunications Research Institute | Visual interface device for mixed presentation of multiple stereoscopic images |
KR100809479B1 (ko) * | 2006-07-27 | 2008-03-03 | Electronics and Telecommunications Research Institute | Face-wearable display device for a mixed reality environment |
US7855743B2 (en) * | 2006-09-08 | 2010-12-21 | Sony Corporation | Image capturing and displaying apparatus and image capturing and displaying method |
WO2010082496A1 (ja) * | 2009-01-19 | 2010-07-22 | Panasonic Corporation | Activation device, method, and computer program for an electroencephalogram interface system |
US9557812B2 (en) * | 2010-07-23 | 2017-01-31 | Gregory A. Maltz | Eye gaze user interface and calibration method |
JP2012186660A (ja) * | 2011-03-06 | 2012-09-27 | Sony Corp | Head-mounted display |
US9081416B2 (en) * | 2011-03-24 | 2015-07-14 | Seiko Epson Corporation | Device, head mounted display, control method of device and control method of head mounted display |
US8184067B1 (en) * | 2011-07-20 | 2012-05-22 | Google Inc. | Nose bridge sensor |
US8929589B2 (en) * | 2011-11-07 | 2015-01-06 | Eyefluence, Inc. | Systems and methods for high-resolution gaze tracking |
US8878749B1 (en) * | 2012-01-06 | 2014-11-04 | Google Inc. | Systems and methods for position estimation |
US20150084864A1 (en) * | 2012-01-09 | 2015-03-26 | Google Inc. | Input Method |
JP5929567B2 (ja) * | 2012-07-03 | 2016-06-08 | Sony Corporation | Image signal processing device, image signal processing method, and program |
WO2014021602A2 (ko) * | 2012-07-31 | 2014-02-06 | Intellectual Discovery Co., Ltd. | Wearable electronic device and control method thereof |
WO2014041872A1 (ja) * | 2012-09-12 | 2014-03-20 | Sony Corporation | Image display device |
US9720231B2 (en) * | 2012-09-26 | 2017-08-01 | Dolby Laboratories Licensing Corporation | Display, imaging system and controller for eyewear display device |
US9019174B2 (en) * | 2012-10-31 | 2015-04-28 | Microsoft Technology Licensing, Llc | Wearable emotion detection and feedback system |
US20170061647A1 (en) * | 2012-11-02 | 2017-03-02 | Thad Eugene Starner | Biometric Based Authentication for Head-Mountable Displays |
US9542958B2 (en) * | 2012-12-18 | 2017-01-10 | Seiko Epson Corporation | Display device, head-mount type display device, method of controlling display device, and method of controlling head-mount type display device |
JP6155643B2 (ja) * | 2013-01-07 | 2017-07-05 | Seiko Epson Corporation | Display device and display device control method |
JP6160154B2 (ja) * | 2013-03-22 | 2017-07-12 | Seiko Epson Corporation | Information display system using a head-mounted display device, information display method using a head-mounted display device, and head-mounted display device |
US9979547B2 (en) * | 2013-05-08 | 2018-05-22 | Google Llc | Password management |
US9251333B2 (en) * | 2013-08-29 | 2016-02-02 | Paypal, Inc. | Wearable user device authentication system |
US10277787B2 (en) * | 2013-09-03 | 2019-04-30 | Tobii Ab | Portable eye tracking device |
US8958158B1 (en) * | 2013-12-03 | 2015-02-17 | Google Inc. | On-head detection for head-mounted display |
US9285872B1 (en) * | 2013-12-12 | 2016-03-15 | Google Inc. | Using head gesture and eye position to wake a head mounted device |
US20150312468A1 (en) * | 2014-04-23 | 2015-10-29 | Narvaro Inc. | Multi-camera system controlled by head rotation |
KR102227087B1 (ko) * | 2014-07-08 | 2021-03-12 | LG Electronics Inc. | Glass-type terminal and control method thereof |
2015
- 2015-04-30 JP JP2016535819A patent/JPWO2016013269A1/ja active Pending
- 2015-04-30 US US15/325,308 patent/US20170186236A1/en not_active Abandoned
- 2015-04-30 WO PCT/JP2015/062929 patent/WO2016013269A1/ja active Application Filing
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10841567B2 (en) | 2016-05-25 | 2020-11-17 | Qingdao Goertek Technology Co., Ltd. | Virtual reality helmet and method for using same |
JP2019525510A (ja) * | 2016-05-25 | 2019-09-05 | Qingdao Goertek Technology Co., Ltd. | Virtual reality helmet and method for using same |
JPWO2018101162A1 (ja) * | 2016-12-01 | 2019-06-24 | Sony Interactive Entertainment Inc. | Head-mounted display, display control device, display control method, and program |
WO2018101162A1 (ja) * | 2016-12-01 | 2018-06-07 | Sony Interactive Entertainment Inc. | Head-mounted display, display control device, display control method, and program |
US10965884B2 (en) | 2016-12-01 | 2021-03-30 | Sony Interactive Entertainment Inc. | Head-mounted display, display controlling apparatus, display controlling method, and program |
JP2020021346A (ja) * | 2018-08-02 | 2020-02-06 | Fujitsu Limited | Display control program, display control method, information processing device, and head-mounted unit |
JP7103040B2 (ja) | 2018-08-02 | 2022-07-20 | Fujitsu Limited | Display control program, display control method, information processing device, and head-mounted unit |
JP2020034646A (ja) * | 2018-08-28 | 2020-03-05 | NTT Docomo, Inc. | Wearable terminal and display system |
JP7140602B2 (ja) | 2018-08-28 | 2022-09-21 | NTT Docomo, Inc. | Wearable terminal and display system |
JPWO2021156939A1 (ja) * | 2020-02-04 | | | |
WO2021156939A1 (ja) * | 2020-02-04 | 2021-08-12 | Maxell, Ltd. | Head-mounted display device |
US11971550B2 (en) | 2020-02-04 | 2024-04-30 | Maxell, Ltd. | Head-mounted display device |
JP7487236B2 (ja) | 2020-02-04 | 2024-05-20 | Maxell, Ltd. | Head-mounted display device |
US12235457B2 (en) | 2020-02-04 | 2025-02-25 | Maxell, Ltd. | Head-mounted display device |
Also Published As
Publication number | Publication date |
---|---|
US20170186236A1 (en) | 2017-06-29 |
JPWO2016013269A1 (ja) | 2017-04-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2016013269A1 (ja) | 2016-01-28 | Image display device, image display method, and computer program |
JP6525010B2 (ja) | | Information processing device, information processing method, and image display system |
KR102332326B1 (ko) | | Image display device and image display method |
JP6079614B2 (ja) | | Image display device and image display method |
KR102316327B1 (ko) | | Method for capturing a virtual space and electronic device therefor |
KR102544062B1 (ko) | | Virtual image display method, storage medium, and electronic device therefor |
JP6217747B2 (ja) | | Information processing device and information processing method |
KR102233223B1 (ko) | | Image display device and image display method, image output device and image output method, and image display system |
JP6340301B2 (ja) | | Head-mounted display, portable information terminal, image processing device, display control program, display control method, and display system |
US20150355463A1 (en) | Image display apparatus, image display method, and image display system | |
US20140126782A1 (en) | Image display apparatus, image display method, and computer program | |
KR20160128119A (ko) | | Mobile terminal and control method thereof |
US20240353922A1 (en) | Devices, methods, and graphical user interfaces for user enrollment and authentication | |
US20240020371A1 (en) | Devices, methods, and graphical user interfaces for user authentication and device management | |
US20200348749A1 (en) | Information processing apparatus, information processing method, and program | |
US20240370542A1 (en) | Devices, methods, and graphical user interfaces for transitioning between multiple modes of operation | |
US12277848B2 (en) | Devices, methods, and graphical user interfaces for device position adjustment | |
WO2024233110A1 (en) | Devices, methods, and graphical user interfaces for transitioning between multiple modes of operation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15824360 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016535819 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15325308 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15824360 Country of ref document: EP Kind code of ref document: A1 |