US20180365900A1 - Mixed Reality Head Mounted Display Device - Google Patents
- Publication number
- US20180365900A1 (application US15/628,560)
- Authority
- US
- United States
- Prior art keywords
- neck
- user
- unit
- display
- face
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1675—Miscellaneous details related to the relative movement between the different enclosures or enclosure parts
- G06F1/1679—Miscellaneous details related to the relative movement between the different enclosures or enclosure parts for locking or maintaining the movable parts of the enclosure in a fixed position, e.g. latching mechanism at the edge of the display in a laptop or for the screen protective cover of a PDA
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0176—Head mounted characterised by mechanical features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1635—Details related to the integration of battery packs and other power supplies such as fuel cells or integrated AC adapter
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1654—Details related to the display arrangement, including those related to the mounting of the display in the housing the display being detachable, e.g. for remote use
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1675—Miscellaneous details related to the relative movement between the different enclosures or enclosure parts
- G06F1/1681—Details related solely to hinges
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
- G09G3/003—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0132—Head-up displays characterised by optical features comprising binocular systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0149—Head-up displays characterised by mechanical features
- G02B2027/0154—Head-up displays characterised by mechanical features with movable elements
- G02B2027/0156—Head-up displays characterised by mechanical features with movable elements with optionally usable elements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
Definitions
- FIG. 9 is a block diagram illustrating a close-up of an electrical connection within the hinge mechanism 88. HMD components are located external to the visor 84; to accommodate those components with a wired connection, there must be electrical conductivity through the hinge 88. There are a number of means of transmitting digital signals and power through a hinge.
- FIG. 10 is a flowchart illustrating a method for communicating between a face-mounted unit and a neck-mounted unit. The connection between the two may be wired or wireless, and may pass data, power, or both. Additionally, if only one of the units includes a power source, power must be shared between the units.
- The HMD displays a mixed reality experience to the user on a near-eye display mounted on the user's head. A neck-mounted unit, which includes a battery and a processor, transmits power and electrical signals to the near-eye display via a connection.
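The power-and-data flow just described can be sketched in code. This is a minimal illustration, not the patent's implementation: the class names, method names, and battery model are invented for the example.

```python
# Hypothetical sketch of the FIG. 10 method: the neck-mounted unit holds
# the battery and processor, renders frames, and pushes both power and
# data over the connection to the face-mounted near-eye display.

class NeckMountedUnit:
    """Models the unit carrying the battery and the processor."""

    def __init__(self, battery_mah: int):
        self.battery_mah = battery_mah  # only this unit has a power source

    def render_frame(self, head_pose):
        # Stand-in for CPU/GPU rendering work.
        return {"pose": head_pose, "pixels": "<frame data>"}

    def transmit(self, frame):
        # The connection may be wired or wireless; when only one unit has
        # a battery, power is shared over the same link (modeled as draw).
        self.battery_mah -= 1
        return frame


class FaceMountedUnit:
    """Models the face-mounted unit holding the near-eye display."""

    def __init__(self):
        self.displayed = []

    def display(self, frame):
        # The near-eye display presents the mixed reality frame.
        self.displayed.append(frame)


neck = NeckMountedUnit(battery_mah=3000)
face = FaceMountedUnit()
face.display(neck.transmit(neck.render_frame(head_pose=(0.0, 0.0, 0.0))))
```

The sketch makes one design point from the text concrete: because only the neck-mounted unit has a power source, every transmission to the face-mounted unit is also a power event.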
Abstract
Disclosed is a head-mounted display (HMD) device for use in mixed reality applications such as virtual or augmented reality. The disclosed HMD includes improvements to the comfort and convenience of wearing. Convenience features include shifting portions of the HMD's heat-generating and heavier components off the user's head and onto the neck, or neck and shoulders. Shifting components to the user's neck opens additional input schemes and enables the use of further feedback devices that improve the immersiveness of the mixed reality experience. An additional convenience feature is an adjustable visor that lifts up and down so that the user may view the real world or the virtual world without having to fully remove the HMD.
Description
- This disclosure relates to head mounted display devices and more particularly to the physical structure thereof.
- Virtual reality (VR) and augmented reality (AR) visualization systems are starting to enter the mainstream consumer electronics marketplace. These devices are often bulky and limit the user's ability to move comfortably or to see their surroundings.
- One or more embodiments of the present disclosure are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements.
- FIG. 1 illustrates an example of an environment including an HMD device.
- FIG. 2 illustrates a first embodiment of a hybrid, face and neck display device.
- FIG. 3 illustrates a second embodiment of a hybrid, face and neck display device.
- FIG. 4A illustrates a hybrid HMD device in a necklace configuration.
- FIG. 4B illustrates a hybrid HMD device in a necklace and pendant configuration.
- FIG. 4C illustrates a hybrid HMD device in a mantle configuration.
- FIG. 5 is a block diagram including a first embodiment of component positioning in a hybrid HMD.
- FIG. 6 is a block diagram including a second embodiment of component positioning in a hybrid HMD.
- FIG. 7 illustrates an adjustable near-eye display visor in a number of configurations.
- FIG. 8 illustrates a hinge mechanism for an adjustable near-eye display device.
- FIG. 9 is a block diagram illustrating a close-up of an electrical connection within the hinge mechanism.
- FIG. 10 is a flowchart illustrating a method for communicating between a face-mounted unit and a neck-mounted unit.
- In this description, references to "an embodiment," "one embodiment" or the like mean that the particular feature, function, structure or characteristic being described is included in at least one embodiment of the technique introduced here. Occurrences of such phrases in this specification do not necessarily all refer to the same embodiment. On the other hand, the embodiments referred to are also not necessarily mutually exclusive.
- Head-mounted displays (HMDs) are often used for virtual reality or augmented reality applications, often inclusively referred to as mixed reality. An engineering concern for HMDs is the ease of wearing: HMDs may be physically and visually restrictive, heavy, awkward, bulky, hot, and disorienting. The present disclosure includes improvements upon these issues.
- One way to improve mobility and comfort is to remove components from the user's head. This is sometimes done with backpacks or connections to external computers. However, these methods make it more difficult to market and sell all-in-one HMD devices, and they tether the user to external equipment. The primary goal is removing weight, bulk, and heat from the user's head or face area. To achieve that, it is unnecessary to go so far as placing components on the user's back. Instead, the components that do not necessarily need to be on the user's face may be moved to the neck and worn as a necklace.
- Another issue is visibility. This issue is particularly noticeable in immersive virtual reality applications: current designs must be taken all the way off in order for the user to see the real world again. A display that can be adjusted to sit either in front of the user's face (in use) or propped up on the forehead, as if the HMD were a pair of glasses, lets the user view the real world without having to concern themselves with tightness settings afterward.
- FIG. 1 shows an example of an environment including an HMD device 10 that can implement the techniques introduced here. In the illustrated example, the HMD device 10 is configured to communicate data to and from a processing system 12 through a connection 14, which can be a wired connection, a wireless connection, or a combination thereof. In some use cases, the HMD device 10 may operate as a standalone device with an integrated processing system 12. In other use cases, the processing system 12 is external to the HMD 10.
- The connection 14 can be configured to carry any kind of data, such as image data (e.g., still images and/or full-motion video, including 2D and 3D images), audio data (including voice), multimedia, and/or any other type(s) of data. The processing system 12 may be, for example, a game console, personal computer, tablet computer, smartphone, or other type of processing device. The connection 14 can be, for example, a universal serial bus (USB) connection, Wi-Fi connection, Bluetooth or Bluetooth Low Energy (BLE) connection, Ethernet connection, cable connection, DSL connection, cellular connection (e.g., 3G, LTE/4G or 5G), or the like, or a combination thereof.
- The processing system 12 may further connect to a network 16, which may be or include, for example, a local area network (LAN), a wide area network (WAN), an intranet, a metropolitan area network (MAN), the global Internet, or a virtual private network (VPN). Through the network 16, the HMD device 10 may make use of a secondary processing system 18. -
FIG. 2 illustrates a first embodiment of a hybrid, face and neck display device 20 (hybrid device). The hybrid device has a face-mounted unit 22 and a neck-mounted unit 24. The face-mounted unit 22 includes components that are sensory and control related (e.g., display, motion sensors, etc.). Conversely, the neck-mounted unit 24 includes the bulkiest components (e.g., the batteries). Some components are neither bulky nor heavy, but are also not sensory or control related (e.g., processors such as the CPU/GPU). These components may be placed in either unit 22, 24, though they are well suited to the neck-mounted unit 24.
- The hybrid device 20 includes some additional enabling components. One such enabling component is a means for mounting the face-mounted unit 22 to the user's head/face. Such means is displayed in FIG. 2 via straps 26. Other than straps 26, the means for mounting may include a headband, a halo, clamps/clips, or a hat/helmet. The face-mounted and neck-mounted units 22, 24 are connected to one another; in FIG. 2, the connection is a cable 28. The cable carries both power and data between the units 22, 24.
- The neck-mounted unit 24 in FIG. 2 is in a necklace configuration including a pendant 30. The necklace configuration may be wrapped, slid, or clamped around the user's neck. FIG. 2 also includes a face-unit-mounted camera 32. The face-mounted unit 22 may include a number of sensors.
- An advantage of the displayed embodiment is that the face-mounted unit is lighter and thus fits more comfortably on the user's face. This enables the user to be more mobile during use. The neck-mounted unit 24 is significantly lighter than a backpack, but still enables weight to be removed from the face-mounted unit 22. Additionally, a more robust cooling system may be employed on the neck-mounted unit 24 than would otherwise be comfortable on a face-mounted unit 22. Vibrations created by a fan are more noticeable and irritating when felt through the skull than on the clavicle. Vibrations received through the skull are often audible (e.g., music played through a metal rod that is clenched in the teeth can be heard). Conversely, a user cannot hear weak vibrations through the clavicle. -
FIG. 3 illustrates a second embodiment of a hybrid, face and neck display device 20A. The figure is similar to the embodiment displayed in FIG. 2, but shows an alternate embodiment 20A of the means of mounting the face-mounted unit 22.
- FIGS. 4A-4C illustrate a number of embodiments for the neck-mounted unit 24. -
FIG. 4A illustrates a hybrid HMD device in a necklace configuration 34. The necklace configuration 34 wraps around the user's neck much like a necklace. In some embodiments, the necklace configuration 34 completely encircles the user's neck once clasped. In other embodiments, the necklace configuration wraps partially around the user's neck. There are a number of ways of securing the necklace configuration 34 on the user, including clasps, force fit, magnets, a hinge, or other methods known in adorning neck-mounted wearables.
- The necklace configuration 34 may position components within a hollow volume of the neck-mounted unit. The exact positioning of each component within the hollow volume may vary. In some embodiments, the components within the hollow volume are positioned to balance weight across the entire neck-mounted unit. -
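The weight-balancing idea above can be illustrated with a simple placement heuristic. This is a hypothetical sketch, not the patent's method: the greedy strategy and the component masses are invented, and the hollow volume is reduced to two sides of the necklace.

```python
# Hypothetical sketch: distribute component masses across the two sides
# of the necklace's hollow volume. Greedy heuristic: place each
# component, heaviest first, on whichever side currently carries less.

def balance_components(masses_g):
    """Split component masses (grams) into two groups of near-equal weight."""
    left, right = [], []
    for mass in sorted(masses_g, reverse=True):
        if sum(left) <= sum(right):
            left.append(mass)
        else:
            right.append(mass)
    return left, right

# Example with invented masses: battery, storage, fan, transceiver, subwoofer.
left, right = balance_components([120, 40, 25, 10, 55])
imbalance = abs(sum(left) - sum(right))  # residual left/right asymmetry
```

A real layout would balance in three dimensions and account for cable routing, but the sketch shows why heavier components (the battery) anchor one side while several lighter ones offset it.
-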
FIG. 4B illustrates a hybrid HMD device in a necklace and pendant configuration. The neck-mounted unit 24 is shown with a necklace configuration 34 and a pendant 30. The pendant 30 is primarily used for component storage; in this configuration, the bulk of the weight of the neck-mounted unit 24 is within the pendant 30. The pendant 30 may be constructed in a number of ways, including a fixed position in front of the user, a dangling position from the necklace 34 (including wired communication therebetween), multiple pendants, or other methods known in the art of adorning worn articles around the neck.
- The pendant itself does not necessarily have to be positioned at the front center of the user's neck. The weight of the components may be distributed at the back, on the sides, or evenly on either side. -
FIG. 4C illustrates a hybrid HMD device in a mantle configuration 36. The mantle configuration 36 is notably more robust than either the necklace configuration 34 or the necklace and pendant configuration 30. The mantle configuration 36 expands across, and derives support from, the wearer's shoulders. Use of a mantle-shaped adornment provides additional space in a hollow volume in which to position components, while still retaining a smaller profile than a backpack.
- While other configurations may include the same additional components, the mantle configuration 36 is more readily configured for haptic feedback, jets of air or water used for improving an immersive experience, and speakers including bass tones or subwoofers. While it might not be preferable to feel vibration from a fan on one's skull (thereby "hearing" the fan), experiencing the pulsing of a subwoofer on one's shoulders can improve an immersive experience. The additional immersive experience components may be positioned on one or both shoulders of the mantle configuration 36. -
FIG. 5 is a block diagram including a first embodiment of component positioning in a hybrid HMD 20 between the face-mounted unit 22 and the neck-mounted unit 24. The overall scheme in component positioning, with some exceptions, is to remove from the user's face all components that do not need to be there. One of the primary components in an HMD is the display 38. The display 38, in operation, needs to be in front of the user's eyes, and therefore must remain on the face-mounted unit 22. In some embodiments, the display 38 is a portion of a mobile device, such as a cell phone or tablet. In such embodiments, the face-mounted unit 22 comprises a housing into which the mobile device may be inserted during use. Additionally, the face-mounted unit 22 may include a suite of sensors. The face-mounted unit 22 uses the sensors to either accept input or render output for the user.
- Sensors in the sensor suite include a binocular optical module 40, a front-facing camera 42, a proximity sensor 44, an eye-gaze sensor 46, an inertial measurement unit (IMU) 48 or accelerometer, a microphone 50, a radar sensor 52, a LIDAR sensor 54, speakers 56, or other sensory equipment known in the art. The binocular optical module 40 refers to a set of lenses that adapts a display for separate viewing by each of the two eyes at very close range. - While these components may be used for multiple purposes, components such as the
display 38, the binocularoptical module 40, and thespeakers 56 provide output to the user. Components such as thecamera 42, theproximity sensor 44, theradar 52, or theLIDAR 54 are used to provide input to theHMD 20 that is environmentally based (i.e., not directly provided by the user). While these environmentally based input sensors may also be positioned on the neck-mountedunit 24, positioning the sensors on the user's face enable them to capture the environment within the line of sight of the user despite the orientation of the user's body. - Finally, components such as the eye-
gaze sensor 46, theIMU 48, and themicrophone 50 each collect direct user input. Theeye gaze sensor 46 must be positioned on the face-mountedunit 22 and within proximity of the user's eyes in order to detect where the user's eyes are looking on the display. Themicrophone 50 may be positioned on the neck-mountedunit 24; however, placing the microphone closer to the user's mouth (i.e., their face) improves microphone performance. - The
IMU 48 is positioned on the face-mountedunit 22 in order to detect the motion of the user's head. One advantage of moving heavy components from the user's face to their neck is that movement of the user's head will feel more natural. Detecting the movement of the user's head is important element in maintaining an immersive experience. The neck mountedunit 24 may also includeIMU 48. Such inclusion of anadditional IMU 48 enables mixed reality programs to isolate input types. For example, afirst IMU 48 on the user's face detects orientation and positioning of the user's head, asecond IMU 48 on the user's neck detects motion of the body. - The distinction between the two input types corresponds to input provided in many known controller based video games—one control to move a player's character (commonly left control stick or WASD keys), and a second control to direct the player character's point of view (commonly right control stick or mouse cursor). Isolating user input into more than one type and from multiple positions may provide increased performance. For example, if a user whips their head forward very quickly, the
HMD 20 will not interpret this as forward motion, because the neck-mountedIMU 48 did not move. - In some embodiments, the
IMU 48 includes matching components in both the face-mountedunit 22 and the neck-mountedunit 24 that make use of magnetic fields to determine their relation in positioning to one another. This provides high-performance detection for user had positioning and orientation. - The neck-mounted
unit 24 includes the remaining components necessary to operate theHMD 20. Notably, these components include abattery 58 in some form of controller or processing unit. The controller may include aCPU 60, and/or aGPU 62. In some embodiments, processing features are not moved to the neck-mountedunit 24. Processors such as theCPU 60 and theGPU 62 may be built with a very small profile and do not weigh very much. Accordingly, the processing components may be placed in the face-mountedunit 22 or an additional unit, such as one positioned at the back of the user's head. - A number of other components may optionally be positioned in the neck-mounted
unit 24. Such components include memory orstorage space 64, a cooling system (fan) 66, ahaptic feedback system 68, the simulated weather system (air jets/water jets) 70, speakers/subwoofers 72, a wireless transceiver (Bluetooth, Wi-Fi, near field communication, etc.), or other suitable devices known in the art. Components such as thebattery 58 orprocessing cores HMD 20 heat management poses limitations for the processing power of theHMD 20. The ability to include afan 66 can improve the overall processing power of theHMD 20. - Haptic feedback systems are often not implemented on head mounted devices for comfort reasons. Placement of the
haptic feedback system 68 on the neck-mountedunit 24 increases functionality without increasing user discomfort. Similarly, including simulated weather/environment systems 70, such as air jets or water jets, on a face-mounted device is difficult. This is due to the difficulty in obtaining a good angle at which to shoot the jet at the user. When mounted on the neck, theHMD 20 has more ability to angle the jets in usable ways. Air or fluid jets may have additional uses beyond replicating micro-weather. For example, the weather/environment system 70 may also create the sense of motion or scent. Perfumes or scented sprays may be emitted from theenvironment system 70. Further air jets can simulate the sensation of not only the movement or air, but the user's movement through the air (e.g., as if on a virtual motorcycle). - Similarly,
subwoofers 72 or other larger speakers are often difficult to include on a compact, face-mounted device. The neck-mounted unit provides more surface area and an internal volume (in which sound waves can resonate), which in turn makes the use of subwoofers more feasible. - Wireless communication is also a relevant portion of modern computing and gaming. Thus, the inclusion of a
wireless transceiver 74 in HMDs is beneficial. Because the wireless communicator does not have to be mounted on the face of the user, optional placement on the neck reduces the number of face-mounted components. The wireless transceiver 74 enables the HMD 20 to communicate with external networks and the Internet, or with additional peripherals such as handheld controllers. -
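Purely as an illustrative sketch, a link between such transceivers could frame messages with a small type-plus-length header. The disclosure specifies no protocol; the message types and layout below are assumptions, not part of the described embodiments.

```python
import struct

# Hypothetical message types for a link between the neck-mounted unit
# and the face-mounted unit. Names and values are illustrative only.
MSG_DISPLAY_FRAME = 1  # neck unit -> face unit: rendered imagery
MSG_SENSOR_DATA = 2    # face unit -> neck unit: sensor stream
MSG_INPUT_EVENT = 3    # peripheral controller -> neck unit

HEADER = struct.Struct("!BI")  # 1-byte type, 4-byte payload length

def encode(msg_type: int, payload: bytes) -> bytes:
    """Frame a payload for transmission over the link."""
    return HEADER.pack(msg_type, len(payload)) + payload

def decode(frame: bytes):
    """Split a received frame back into (type, payload)."""
    msg_type, length = HEADER.unpack_from(frame)
    payload = frame[HEADER.size:HEADER.size + length]
    return msg_type, payload
```

A fixed header like this lets either unit route display, sensor, and controller traffic over a single wireless channel.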
FIG. 6 is a block diagram including a second embodiment of component positioning in a hybrid HMD 20. FIG. 6 is similar to FIG. 5 with the addition of wireless communicators in both the face-mounted unit 22 and the neck-mounted unit 24. In some embodiments, communication between both units 22, 24 is wireless: the wireless transceiver 74 of the neck-mounted unit 24 transmits signals to and from a face-mounted wireless transceiver 76. Where power is not transferred over a wire, the face-mounted unit 22 additionally requires a face-mounted battery 78 to operate. - The
HMD 20 may additionally communicate with outside peripherals, such as a controller 80. Communication with the controller 80 may be wired or wireless. -
FIG. 7 illustrates an adjustable near-eye display visor in a number of configurations. An additional issue in HMD wearability is the ability for the user to return to the real world conveniently. HMDs generally have a snug fit on the user's head, and in some circumstances it is inconvenient to take off the HMD entirely. Rather, it is preferable for the user to be able to briefly view the real world again before returning to an immersive VR experience. - The
adjustable HMD 82 includes an adjustable visor 84 attached to a head mount 86 via a hinge 88. The adjustable visor 84 includes the near-eye display of the HMD device. In use, the adjustable visor 84 is positioned in front of the user's eyes. When the user wishes to view the real world again without taking the adjustable HMD 82 off, the user lifts the adjustable visor 84 to a raised position on their head and locks the visor in position via the hinge 88. The head mount 86 wraps fully around the user's head such that the visor 84 is not required to stabilize the adjustable HMD 82. The features of the adjustable HMD 82 may be used with the features of the hybrid HMD 20, such that an adjustable visor 84 is included in the same HMD as a neck-mounted unit 24. -
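The raised/lowered behavior of such a visor can be modeled as a small two-state machine. The class and the choice to pause the display while raised are illustrative assumptions, not details from the disclosure.

```python
from enum import Enum, auto

class VisorPosition(Enum):
    LOWERED = auto()  # in front of the user's eyes, immersive mode
    RAISED = auto()   # locked up via the hinge, real-world view

class AdjustableVisor:
    """Illustrative model of a hinged visor; names and the display
    power policy are assumptions, not the patent's embodiments."""

    def __init__(self):
        self.position = VisorPosition.LOWERED
        self.display_active = True

    def raise_visor(self):
        # Lifting the visor lets the user see the real world without
        # removing the HMD; pausing the display conserves the battery.
        self.position = VisorPosition.RAISED
        self.display_active = False

    def lower_visor(self):
        # Returning the visor restores the immersive experience.
        self.position = VisorPosition.LOWERED
        self.display_active = True
```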
FIG. 8 illustrates a hinge mechanism for an adjustable near-eye display device. FIG. 8 includes the same adjustable HMD 82 as the views of FIG. 7, though with increased focus on the hinge 88. Further illustrated are a number of the sensory suite components of the face-mounted unit 22 of the hybrid HMD 20, such as the front facing camera 42 and speakers 56. -
FIG. 9 is a block diagram illustrating a close-up of an electrical connection within the hinge mechanism 88. In some embodiments, HMD components are located external to the visor 84. To accommodate those components with a wired connection, there must be electrical conductivity through the hinge 88. There are a number of means of transmitting digital signals and power through a hinge. - One such means includes a contact surface 89 between an
inner ring 90 and outer ring 92 of the hinge 88. Each of the head mount 86 and the adjustable visor 84 is associated with one of either the inner ring 90 or the outer ring 92, and wired connections run from the components of the adjustable HMD 82 to the rings 90, 92. - An alternative means is to make use of the
central area 94 of the hinge 88. In some embodiments, a wire merely runs through the central area 94. Alternatively, each side of the central area 94, one side for the adjustable visor 84 and one side for the head mount 86, includes a contact surface. -
FIG. 10 is a flowchart illustrating a method for communicating between a face-mounted unit and a neck-mounted unit. When both a face-mounted unit and a neck-mounted unit are used, communication between the two is necessary. The connection between the two may be wired or wireless, and may pass data, power, or both. Additionally, if only one of the units includes a power source, power must be shared between the units. In step 1002, the HMD displays a mixed reality experience on a near-eye display mounted on a head of a user. In step 1004, a neck-mounted unit transmits power and electrical signals to the near-eye display via a connection, wherein the neck-mounted unit includes a battery and a processor. - Although the subject matter has been described in language specific to structural features and/or acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as examples of implementing the claims, and other equivalent features and acts are intended to be within the scope of the claims.
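The two steps of FIG. 10, rendering on the neck-mounted unit and presenting on the near-eye display over the connecting link, can be sketched as a simple per-frame loop. The callables `render`, `transmit`, and `display` are stand-ins for the neck unit's processor, the wired/wireless connection, and the near-eye display; all are illustrative assumptions rather than the disclosed implementation.

```python
def operate_mixed_reality_device(render, transmit, display, frames):
    """Per-frame sketch of the method of FIG. 10.

    render   -- graphical processing performed on the neck-mounted unit
    transmit -- the connection carrying power and electrical signals (step 1004)
    display  -- presentation on the near-eye display (step 1002)
    frames   -- an iterable of scene descriptions to show
    """
    shown = []
    for scene in frames:
        signal = render(scene)        # processed on the neck-mounted unit
        signal = transmit(signal)     # step 1004: sent over the link
        shown.append(display(signal)) # step 1002: presented to the user
    return shown
```

With trivial stand-in callables, the loop simply threads each scene through the three stages in order.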
Claims (20)
1. A head mounted display comprising:
a face-mounted unit, configured to mount on a face of a user, including:
a display; and
a neck-mounted unit configured to wrap around a neck of the user and communicatively coupled to the face-mounted unit, the neck-mounted unit having a single hollow volume that encircles the neck of the user, the hollow volume encasing:
a controller; and
a battery.
2. The head mounted display of claim 1, wherein the communicative connection between the face-mounted unit and the neck-mounted unit is implemented via a wired connection.
3. The head mounted display of claim 1, wherein the communicative connection between the face-mounted unit and the neck-mounted unit is implemented via a wireless connection, and the face-mounted unit further comprises:
a wireless transceiver; and
a second battery that powers the wireless transceiver and the display.
4. The head mounted display of claim 1, wherein the controller further comprises any of:
a central processing unit; or
a graphics processing unit.
5. The head mounted display of claim 4, wherein the battery provides the majority of the power consumed by any of:
the display;
the central processing unit; or
the graphics processing unit.
6. The head mounted display of claim 1, the face-mounted unit further comprising any of:
a binocular optical module;
a camera sensor;
an inertial measurement sensor;
a microphone;
a radar sensor;
a LIDAR sensor;
a proximity sensor;
an eye gaze sensor; or
speakers.
7. The head mounted display of claim 6, further comprising:
a sensor fusion system that is configured to fuse sensor data locally collected from the face-mounted unit into an input stream and to communicate the input stream to the controller.
8. The head mounted display of claim 1, further comprising any of:
buttons; or
a peripheral control device.
9. The head mounted display of claim 1, wherein the neck-mounted unit further comprises any of:
a cooling system;
a haptic feedback system;
speakers;
an air jet system;
a memory; or
a wireless transceiver.
10. The head mounted display of claim 1, wherein the neck-mounted unit is shaped in a necklace configuration.
11. The head mounted display of claim 10, wherein the battery and the controller are positioned in a pendant module on the necklace configuration.
12. The head mounted display of claim 1, wherein the neck-mounted unit is shaped in a mantle configuration and is further supported by shoulders of the user.
13. The head mounted display of claim 1, further comprising:
a rear processing unit mounted behind the head of the user and including any of:
a graphics processing unit; or
a central processing unit.
14. A mixed reality device comprising:
a near-eye display mounted on a head of a user;
a neck-mounted unit including a battery, a processor, and a cooling unit, wherein the battery provides power for both the processor and the near-eye display and the processor provides graphical processing for the near-eye display to render, and the cooling unit manages temperatures for the battery and the processor; and
a connection between the near-eye display and the neck-mounted unit that transmits power and electrical signals there between.
15. The mixed reality device of claim 14, the near-eye display further comprising any of:
a binocular optical module;
a camera sensor;
an inertial measurement sensor;
a microphone;
a radar sensor;
a LIDAR sensor;
a proximity sensor;
an eye gaze sensor; or
speakers.
16. The mixed reality device of claim 14, wherein the neck-mounted unit further comprises any of:
a haptic feedback system;
speakers;
an air jet system;
a memory; or
a wireless transceiver.
17. The mixed reality device of claim 14, wherein the neck-mounted unit is shaped in a necklace configuration.
18. The mixed reality device of claim 17, wherein the battery and the processor are positioned in a pendant module on the necklace configuration.
19. The mixed reality device of claim 14, wherein the neck-mounted unit is shaped in a mantle configuration and is further supported by shoulders of the user.
20. A method of operating a mixed reality device comprising:
displaying, to a user, a mixed reality experience on a near-eye display mounted on a head of the user; and
transmitting power and electrical signals between a neck-mounted unit and the near-eye display via a connection, wherein the neck-mounted unit includes a battery and a processor contained within a single hollow volume that encircles a neck of the user.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/628,560 US20180365900A1 (en) | 2017-06-20 | 2017-06-20 | Mixed Reality Head Mounted Display Device |
US15/630,292 US10401913B2 (en) | 2017-06-20 | 2017-06-22 | Mixed reality head mounted display device |
CN201810637206.3A CN109100865A (en) | 2017-06-20 | 2018-06-20 | Mixed reality head-wearing display device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/628,560 US20180365900A1 (en) | 2017-06-20 | 2017-06-20 | Mixed Reality Head Mounted Display Device |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/630,292 Continuation US10401913B2 (en) | 2017-06-20 | 2017-06-22 | Mixed reality head mounted display device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180365900A1 true US20180365900A1 (en) | 2018-12-20 |
Family
ID=64658023
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/628,560 Abandoned US20180365900A1 (en) | 2017-06-20 | 2017-06-20 | Mixed Reality Head Mounted Display Device |
US15/630,292 Expired - Fee Related US10401913B2 (en) | 2017-06-20 | 2017-06-22 | Mixed reality head mounted display device |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/630,292 Expired - Fee Related US10401913B2 (en) | 2017-06-20 | 2017-06-22 | Mixed reality head mounted display device |
Country Status (2)
Country | Link |
---|---|
US (2) | US20180365900A1 (en) |
CN (1) | CN109100865A (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109631887A (en) * | 2018-12-29 | 2019-04-16 | 重庆邮电大学 | Inertial navigation high-precision locating method based on binocular, acceleration and gyroscope |
CN110320665A (en) * | 2019-03-20 | 2019-10-11 | 郑州铁路职业技术学院 | A kind of VR glasses that adjustable wearing is comfortable |
CN110672097A (en) * | 2019-11-25 | 2020-01-10 | 北京中科深智科技有限公司 | Indoor positioning and tracking method, device and system based on laser radar |
CN111208646A (en) * | 2020-03-04 | 2020-05-29 | 杭州光粒科技有限公司 | Head Mounted Displays and Wearables |
CN115561904A (en) * | 2022-10-11 | 2023-01-03 | 常州工学院 | An adjustable VR display device |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6502989B2 (en) * | 2017-03-27 | 2019-04-17 | 本田技研工業株式会社 | Head mounted display |
US11402925B2 (en) * | 2019-09-25 | 2022-08-02 | Apple Inc. | Crown input and feedback for head-mountable devices |
CN111025641B (en) * | 2019-12-20 | 2021-08-03 | 中国地质大学(武汉) | A mixed reality display device based on eye tracker |
US11789276B1 (en) * | 2020-04-06 | 2023-10-17 | Apple Inc. | Head-mounted device with pivoting connectors |
US11528953B2 (en) | 2020-05-19 | 2022-12-20 | Rockwell Collins, Inc. | Display embedded visor helmet mounted display |
CN112558306B (en) * | 2020-12-23 | 2023-06-30 | Oppo广东移动通信有限公司 | Neckworn and Wearables |
US20220299792A1 (en) * | 2021-03-18 | 2022-09-22 | Meta Platforms Technologies, Llc | Lanyard for smart frames and mixed reality devices |
EP4418072A4 (en) * | 2021-12-08 | 2025-02-19 | Samsung Electronics Co Ltd | PORTABLE ELECTRONIC DEVICE WITH IMPLEMENTED DISTRIBUTION SYSTEM FOR CONTENT AND VISION PROCESSING |
CN117687221B (en) * | 2024-02-04 | 2024-04-12 | 中国民用航空飞行学院 | VR glasses based on flight simulation inspection uses |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120075169A1 (en) * | 2010-09-29 | 2012-03-29 | Olympus Corporation | Head-mounted display |
US20120249741A1 (en) * | 2011-03-29 | 2012-10-04 | Giuliano Maciocci | Anchoring virtual images to real world surfaces in augmented reality systems |
US8352046B1 (en) * | 2009-01-30 | 2013-01-08 | Advanced Bionics, Llc | Sound processing assembly for use in a cochlear implant system |
US20160070110A1 (en) * | 2014-04-09 | 2016-03-10 | Alexey Leonidovich Ushakov | Composite wearable electronic communication device |
US20170068119A1 (en) * | 2014-02-19 | 2017-03-09 | Evergaze, Inc. | Apparatus and Method for Improving, Augmenting or Enhancing Vision |
US20170274282A1 (en) * | 2016-03-23 | 2017-09-28 | Intel Corporation | Immersive gaming |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100259471A1 (en) * | 2007-11-16 | 2010-10-14 | Nikon Corporation | Control device, head-mount display device, program, and control method |
US20110090135A1 (en) * | 2009-10-21 | 2011-04-21 | Symbol Technologies, Inc. | Interchangeable display device for a head-mounted display system |
KR102218913B1 (en) * | 2014-09-04 | 2021-02-23 | 엘지전자 주식회사 | Smart bracelet |
DE202014105859U1 (en) | 2014-12-04 | 2015-04-17 | Asia Vital Components Co., Ltd. | Cooling structure of Wearable Smart Device |
US20170351096A1 (en) * | 2016-06-01 | 2017-12-07 | Cheng-Ho Tsai | Head-mounted equipment capable of displaying images |
- 2017-06-20: US 15/628,560 (US20180365900A1), not active, abandoned
- 2017-06-22: US 15/630,292 (US10401913B2), not active, expired (fee related)
- 2018-06-20: CN 201810637206.3A (CN109100865A), active, pending
Also Published As
Publication number | Publication date |
---|---|
US20180364766A1 (en) | 2018-12-20 |
US10401913B2 (en) | 2019-09-03 |
CN109100865A (en) | 2018-12-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10401913B2 (en) | Mixed reality head mounted display device | |
US11805347B2 (en) | Display system having an audio output device | |
EP3884335B1 (en) | Systems and methods for maintaining directional wireless links of motile devices | |
US11055056B1 (en) | Split system for artificial reality | |
US11366522B1 (en) | Systems and methods for providing substantially orthogonal movement of a device about a user's body part | |
WO2017124724A1 (en) | Head-mounted display | |
CN205620609U (en) | Modularization wear -type electronic equipment | |
CN205656368U (en) | Head-mounted virtual reality audio-visual device | |
US10536666B1 (en) | Systems and methods for transmitting aggregated video data | |
US11990689B2 (en) | Antenna system for wearable devices | |
CN204855941U (en) | Wear -type virtual reality equipment and system | |
US11522841B1 (en) | Third-party data manipulation with privacy controls | |
US11816886B1 (en) | Apparatus, system, and method for machine perception | |
CN205485063U (en) | Head display | |
Hussein | Wearable computing: Challenges of implementation and its future | |
US12034200B1 (en) | Integrated camera antenna | |
US20200251071A1 (en) | Wearable Device and Method Therein | |
US11287885B1 (en) | Apparatus, system, and method for determining the position of wearables donned by users of artificial reality systems | |
CN208013548U (en) | Reality enhancing glasses | |
CN205594225U (en) | Novel virtual reality 3D glasses | |
CN119211512B (en) | Augmented reality information notification method and device, electronic equipment and storage medium | |
US20240184250A1 (en) | Antenna systems with an extended ground in a detachable cradle | |
US20230327328A1 (en) | Antenna system for mobile devices | |
US20240346221A1 (en) | Circuits and methods for reducing the effects of variation in inter-die communication in 3d-stacked systems | |
US20240097311A1 (en) | Antenna for wearable electronic devices |
Legal Events

Code | Title | Description |
---|---|---|
STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
STCB | Information on status: application discontinuation | ABANDONED -- FAILURE TO PAY ISSUE FEE |