
US20180365900A1 - Mixed Reality Head Mounted Display Device - Google Patents

Mixed Reality Head Mounted Display Device

Info

Publication number
US20180365900A1
US20180365900A1 (Application No. US15/628,560)
Authority
US
United States
Prior art keywords
neck
user
unit
display
face
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/628,560
Inventor
Jonathan Mendoza
Toby Stopper
Kevin Hoffman
Ignazio Moresco
Allen Yang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Immerex Inc
Original Assignee
Immerex Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Immerex Inc filed Critical Immerex Inc
Priority to US15/628,560 priority Critical patent/US20180365900A1/en
Priority to US15/630,292 priority patent/US10401913B2/en
Priority to CN201810637206.3A priority patent/CN109100865A/en
Publication of US20180365900A1 publication Critical patent/US20180365900A1/en
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1675Miscellaneous details related to the relative movement between the different enclosures or enclosure parts
    • G06F1/1679Miscellaneous details related to the relative movement between the different enclosures or enclosure parts for locking or maintaining the movable parts of the enclosure in a fixed position, e.g. latching mechanism at the edge of the display in a laptop or for the screen protective cover of a PDA
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0176Head mounted characterised by mechanical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1635Details related to the integration of battery packs and other power supplies such as fuel cells or integrated AC adapter
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1654Details related to the display arrangement, including those related to the mounting of the display in the housing the display being detachable, e.g. for remote use
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1675Miscellaneous details related to the relative movement between the different enclosures or enclosure parts
    • G06F1/1681Details related solely to hinges
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G09G3/003Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0132Head-up displays characterised by optical features comprising binocular systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0149Head-up displays characterised by mechanical features
    • G02B2027/0154Head-up displays characterised by mechanical features with movable elements
    • G02B2027/0156Head-up displays characterised by mechanical features with movable elements with optionally usable elements
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Power Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed is a head mounted display (HMD) device for use in mixed reality applications such as virtual or augmented reality. The disclosed HMD includes improvements to the comfort and convenience of wearing. Convenience features include shifting portions of the HMD's heat-generating and heavier components off the user's head and onto the neck, or neck and shoulders. Shifting components to the user's neck opens additional input schemes and enables the use of further feedback devices that improve the immersiveness of the mixed reality experience. An additional convenience feature is an adjustable visor that lifts up and down so that the user may view the real world or the virtual world without having to remove the HMD fully from the head.

Description

    TECHNICAL FIELD
  • This disclosure relates to head mounted display devices and more particularly to the physical structure thereof.
  • BACKGROUND
  • Virtual reality (VR) and augmented reality (AR) visualization systems are starting to enter the mainstream consumer electronics marketplace. These devices are often bulky and limit the ability of the user to move comfortably or see.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • One or more embodiments of the present disclosure are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements.
  • FIG. 1 illustrates an example of an environment including an HMD device.
  • FIG. 2 illustrates a first embodiment of a hybrid, face and neck display device.
  • FIG. 3 illustrates a second embodiment of a hybrid, face and neck display device.
  • FIG. 4A illustrates a hybrid HMD device in a necklace configuration.
  • FIG. 4B illustrates a hybrid HMD device in a necklace and pendant configuration.
  • FIG. 4C illustrates a hybrid HMD device in a mantle configuration.
  • FIG. 5 is a block diagram including a first embodiment of component positioning in a hybrid HMD.
  • FIG. 6 is a block diagram including a second embodiment of component positioning in a hybrid HMD.
  • FIG. 7 illustrates an adjustable near-eye display visor in a number of configurations.
  • FIG. 8 illustrates a hinge mechanism for an adjustable near-eye display device.
  • FIG. 9 is a block diagram illustrating a close up of an electrical connection within the hinge mechanism.
  • FIG. 10 is a flowchart illustrating a method for communicating between a face-mounted unit and a neck mounted-unit.
  • DETAILED DESCRIPTION
  • In this description, references to “an embodiment,” “one embodiment” or the like, mean that the particular feature, function, structure or characteristic being described is included in at least one embodiment of the technique introduced here. Occurrences of such phrases in this specification do not necessarily all refer to the same embodiment. On the other hand, the embodiments referred to also are not necessarily mutually exclusive.
  • Head mounted displays (HMDs) are often used for virtual reality or augmented reality applications, often inclusively referred to as mixed reality. An engineering concern with HMDs is ease of wearing. HMDs may be physically and visually restrictive, heavy, awkward, bulky, hot, and disorienting. The present disclosure includes improvements upon these issues.
  • One way to improve mobility and comfort is to remove components from the user's head. This is sometimes done with backpacks or connections to external computers. However, these approaches make all-in-one HMD devices more difficult to market and sell, and they tether the user to external equipment. The primary goal is removing weight, bulk, and heat from the user's head or face; to achieve that, it is unnecessary to go so far as placing components on the user's back. Instead, the components that do not necessarily need to be on the user's face may be moved to the neck and worn as a necklace.
  • Another issue is visibility. This issue is particularly noticeable in immersive virtual reality applications. Current designs must be taken all the way off in order for the user to see the real world again. A display that can be adjusted to sit either in front of the user's face (in use) or propped up on the forehead, as if the HMD were a pair of glasses, enables the user to view the real world without having to concern themselves with tightness settings.
  • FIG. 1 shows an example of an environment including an HMD device 10 that can implement the techniques introduced here. In the illustrated example, the HMD device 10 is configured to communicate data to and from a processing system 12 through a connection 14, which can be a wired connection, a wireless connection, or a combination thereof. In some use cases, the HMD device 10 may operate as a standalone device with an integrated processing system 12. In other use cases, the processing system 12 is external to the HMD device 10.
  • The connection 14 can be configured to carry any kind of data, such as image data (e.g., still images and/or full-motion video, including 2D and 3D images), audio data (including voice), multimedia, and/or any other type(s) of data. The processing system 12 may be, for example, a game console, personal computer, tablet computer, smartphone, or other type of processing device. The connection 14 can be, for example, a universal serial bus (USB) connection, Wi-Fi connection, Bluetooth or Bluetooth Low Energy (BLE) connection, Ethernet connection, cable connection, DSL connection, cellular connection (e.g., 3G, LTE/4G or 5G), or the like, or a combination thereof. Additionally, the processing system 12 may communicate with one or more other processing systems via a network, which may be or include, for example, a local area network (LAN), a wide area network (WAN), an intranet, a metropolitan area network (MAN), the global Internet, or a combination thereof.
  • The processing system 12 may further connect to a network 16, such as the Internet, a local area network (LAN), or a virtual private network (VPN). Through the network 16, the HMD device 10 may make use of a secondary processing system 18.
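  • As a purely illustrative sketch of the FIG. 1 topology, the configuration below models the HMD 10, the processing system 12, the connection 14, and a secondary processing system 18 reachable via the network 16. The class names, fields, and selection logic are assumptions for illustration; the patent specifies only the connections themselves, not any software interface.

```python
# Illustrative sketch only: the patent does not define any software interface
# for FIG. 1. Names such as ConnectionKind and Environment are hypothetical.
from dataclasses import dataclass
from enum import Enum, auto


class ConnectionKind(Enum):
    """Example link types the disclosure lists for connection 14."""
    USB = auto()
    WIFI = auto()
    BLUETOOTH_LE = auto()
    ETHERNET = auto()
    CELLULAR = auto()


@dataclass
class Environment:
    """FIG. 1 topology: HMD 10 <-> processing system 12 <-> network 16/18."""
    hmd_standalone: bool        # True when processing system 12 is integrated
    link: ConnectionKind        # connection 14
    secondary_available: bool   # secondary processing system 18 via network 16

    def render_location(self) -> str:
        # Assumed policy for where frames are generated; the patent only
        # describes the connections, not any selection logic.
        if self.hmd_standalone:
            return "integrated processing system 12"
        if self.secondary_available:
            return "secondary processing system 18 via network 16"
        return "external processing system 12 via connection 14"


if __name__ == "__main__":
    env = Environment(hmd_standalone=False, link=ConnectionKind.USB,
                      secondary_available=True)
    print(env.render_location())
```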
  • FIG. 2 illustrates a first embodiment of a hybrid, face and neck display device 20 (hybrid device). The hybrid device has a face-mounted unit 22 and a neck-mounted unit 24. The face-mounted unit 22 includes components that are sensory and control related (e.g., display, motion sensors). Conversely, the neck-mounted unit 24 includes the bulkiest components (e.g., the batteries). Some components are neither bulky nor heavy, but are also not related to sensory or control functions (e.g., processors such as the CPU/GPU). These components may be placed in either unit 22, 24, or both. In some embodiments, components not related to sensory or control apparatus are positioned in the neck-mounted unit 24.
  • The hybrid device 20 includes some additional enabling components. One such enabling component is a means for mounting the face-mounted unit 22 to the user's head/face. Such means is displayed in FIG. 2 via straps 26. Other than straps 26, the means for mounting may include a headband, a halo, clamps/clips, or a hat/helmet. The face and neck mounted units 22, 24 are communicatively connected to one another. This connection may be either wired or wireless. In FIG. 2, the connection is a cable 28. The cable carries both power and data between the units 22, 24.
  • The neck-mounted unit 24 in FIG. 2 is in a necklace configuration including a pendant 30. The necklace configuration may be wrapped, slid, or clamped around the user's neck. FIG. 2 also includes a face-unit mounted camera 32. The face-mounted unit 22 may include a number of sensors.
  • An advantage of the displayed embodiment is that the face-mounted unit is lighter and thus will fit more comfortably on the user's face. This enables the user to be more mobile during use. The neck-mounted unit 24 is significantly lighter than a backpack, but still enables weight to be removed from the face-mounted unit 22. Additionally, a more robust cooling system may be employed on the neck-mounted unit 24 than would otherwise be comfortable on a face-mounted unit 22. Vibrations created by a fan are more noticeable and irritating when felt through the skull than through the clavicle. Vibrations received through the skull are often audible (e.g., music played through a metal rod clenched in the teeth can be heard). Conversely, a user cannot hear weak vibrations through the clavicle.
  • FIG. 3 illustrates a second embodiment of a hybrid, face and neck display device 20A. The figure is similar to the embodiment displayed in FIG. 2. Displayed is an alternate embodiment 20A of the means of mounting the face-mounted unit 22.
  • FIGS. 4A-4C illustrate a number of embodiments for the neck-mounted unit 24.
  • FIG. 4A illustrates a hybrid HMD device in a necklace configuration 34. The necklace configuration 34 wraps around the user's neck much like a necklace. In some embodiments, the necklace configuration 34 completely encircles the user's neck once clasped. In other embodiments, the necklace configuration wraps partially around the user's neck. There are a number of ways of securing the necklace configuration 34 on the user, including clasps, force fit, magnets, a hinge, or other methods known in the art of neck-mounted wearables.
  • The necklace configuration 34 may position components within a hollow volume of the neck-mounted unit. The exact positioning of each component within the hollow volume may vary. In some embodiments, the components within the hollow volume are positioned to balance weight across the entire neck-mounted unit.
  • FIG. 4B illustrates a hybrid HMD device in a necklace and pendant configuration. The neck-mounted unit 24 is shown with a necklace configuration 34 and a pendant 30. The pendant 30 is primarily used for component storage. In this configuration the bulk of the weight of the neck-mounted unit 24 is within the pendant 30. The pendant 30 may be constructed in a number of ways, including a fixed position in front of the user, a dangling position from the necklace 34 (including wired communication therebetween), multiple pendants, or other methods known in the art of adorning worn articles around the neck.
  • The pendant itself does not necessarily have to be positioned on the front, center of the user's neck. The weight of the components may be distributed at the back, on the sides, or evenly on either side.
  • FIG. 4C illustrates a hybrid HMD device in a mantle configuration 36. The mantle configuration 36 is notably more robust than either the necklace configuration 34 or the necklace and pendant configuration. The mantle configuration 36 extends across, and derives support from, the wearer's shoulders. A mantle-shaped adornment provides additional space in a hollow volume in which to position components, while still retaining a smaller profile than a backpack.
  • While other configurations may include the same additional components, the mantle configuration 36 is more readily configured for haptic feedback, jets of air or water used for improving an immersive experience, and speakers including bass tones or subwoofers. While it might not be preferable to feel vibration from a fan on one's skull (thereby “hearing” the fan), experiencing the pulsing of a subwoofer on one's shoulders can improve an immersive experience. The additional immersive experience components may be positioned on one or both shoulders of the mantle configuration 36.
  • FIG. 5 is a block diagram including a first embodiment of component positioning in a hybrid HMD 20 between the face-mounted unit 22 and the neck-mounted unit 24. The overall scheme in component positioning, with some exceptions, is to remove from the user's face all components that do not need to be there. One of the primary components in an HMD is the display 38. The display 38, in operation, needs to be in front of the user's eyes, and therefore must remain on the face-mounted unit 22. In some embodiments, the display 38 is a portion of a mobile device such as a cell phone or tablet. In such embodiments, the face-mounted unit 22 comprises a housing into which the mobile device may be inserted during use. Additionally, the face-mounted unit 22 may include a suite of sensors. The face-mounted unit 22 uses the sensors to either accept input or render output for the user.
  • Sensors in the sensor suite include a binocular optical module 40, a front facing camera 42, a proximity sensor 44, an eye-gaze sensor 46, an inertial measurement unit (IMU) 48 or accelerometer, a microphone 50, a radar sensor 52, a LIDAR sensor 54, speakers 56, or other sensory equipment known in the art. The binocular optical module 40 refers to a set of lenses that adapt a display to be viewed separately by each of two eyes at very close range.
  • While these components may be used for multiple purposes, components such as the display 38, the binocular optical module 40, and the speakers 56 provide output to the user. Components such as the camera 42, the proximity sensor 44, the radar 52, or the LIDAR 54 provide input to the HMD 20 that is environmentally based (i.e., not directly provided by the user). While these environmentally based input sensors may also be positioned on the neck-mounted unit 24, positioning the sensors on the user's face enables them to capture the environment within the line of sight of the user regardless of the orientation of the user's body.
  • Finally, components such as the eye-gaze sensor 46, the IMU 48, and the microphone 50 each collect direct user input. The eye gaze sensor 46 must be positioned on the face-mounted unit 22 and within proximity of the user's eyes in order to detect where the user's eyes are looking on the display. The microphone 50 may be positioned on the neck-mounted unit 24; however, placing the microphone closer to the user's mouth (i.e., their face) improves microphone performance.
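  • The placement scheme and sensor groupings described above can be summarized in a short sketch. The reference numerals below mirror FIGS. 5 and 6, but the data structures, role groupings, and helper function are illustrative assumptions rather than part of the disclosure.

```python
# Hypothetical summary of the component-placement scheme of FIGS. 5-6.
from enum import Enum, auto


class Unit(Enum):
    FACE_MOUNTED = auto()   # unit 22
    NECK_MOUNTED = auto()   # unit 24


class Role(Enum):
    OUTPUT = auto()              # rendered to the user
    ENVIRONMENT_INPUT = auto()   # captures the surroundings
    USER_INPUT = auto()          # captures the user directly


# Face-mounted sensor suite and the roles described above.
FACE_COMPONENTS = {
    "display_38": Role.OUTPUT,
    "binocular_optics_40": Role.OUTPUT,
    "speakers_56": Role.OUTPUT,
    "camera_42": Role.ENVIRONMENT_INPUT,
    "proximity_sensor_44": Role.ENVIRONMENT_INPUT,
    "radar_52": Role.ENVIRONMENT_INPUT,
    "lidar_54": Role.ENVIRONMENT_INPUT,
    "eye_gaze_46": Role.USER_INPUT,
    "imu_48": Role.USER_INPUT,
    "microphone_50": Role.USER_INPUT,
}

# Bulky or heat-generating components move to the neck-mounted unit 24.
NECK_COMPONENTS = [
    "battery_58", "cpu_60", "gpu_62", "storage_64", "fan_66",
    "haptics_68", "air_water_jets_70", "subwoofer_72", "transceiver_74",
]


def placement(component: str) -> Unit:
    """Return which unit houses a component under the scheme of FIG. 5."""
    return Unit.FACE_MOUNTED if component in FACE_COMPONENTS else Unit.NECK_MOUNTED


if __name__ == "__main__":
    print(placement("gpu_62"))        # Unit.NECK_MOUNTED
    print(placement("eye_gaze_46"))   # Unit.FACE_MOUNTED
```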
  • The IMU 48 is positioned on the face-mounted unit 22 in order to detect the motion of the user's head. One advantage of moving heavy components from the user's face to their neck is that movement of the user's head will feel more natural. Detecting the movement of the user's head is an important element in maintaining an immersive experience. The neck-mounted unit 24 may also include an IMU 48. Such inclusion of an additional IMU 48 enables mixed reality programs to isolate input types. For example, a first IMU 48 on the user's face detects orientation and positioning of the user's head, while a second IMU 48 on the user's neck detects motion of the body.
  • The distinction between the two input types corresponds to input provided in many known controller based video games—one control to move a player's character (commonly left control stick or WASD keys), and a second control to direct the player character's point of view (commonly right control stick or mouse cursor). Isolating user input into more than one type and from multiple positions may provide increased performance. For example, if a user whips their head forward very quickly, the HMD 20 will not interpret this as forward motion, because the neck-mounted IMU 48 did not move.
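  • A minimal sketch of this two-IMU input isolation is shown below. The thresholds, units, and function names are assumptions; the patent describes the behavior only at the level of "the head IMU drives the view, the neck IMU drives locomotion."

```python
# Minimal sketch of two-IMU input isolation, assuming simple accelerometer
# magnitudes and made-up thresholds; the patent describes only the behavior,
# not this code.
from dataclasses import dataclass
from typing import Dict, Tuple


@dataclass
class ImuSample:
    accel: Tuple[float, float, float]   # m/s^2, gravity already removed
    yaw: float                          # degrees
    pitch: float
    roll: float


def magnitude(v: Tuple[float, float, float]) -> float:
    return (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5


def interpret_motion(head_imu: ImuSample, neck_imu: ImuSample,
                     body_motion_threshold: float = 0.5) -> Dict:
    """Head IMU 48 updates the view; neck IMU 48 decides whether the body moved.

    A fast head whip produces large head acceleration but little neck
    acceleration, so it changes the camera orientation without being
    treated as forward motion of the player.
    """
    view = {"yaw": head_imu.yaw, "pitch": head_imu.pitch, "roll": head_imu.roll}
    body_moving = magnitude(neck_imu.accel) > body_motion_threshold
    locomotion = neck_imu.accel if body_moving else (0.0, 0.0, 0.0)
    return {"view": view, "locomotion": locomotion}


if __name__ == "__main__":
    head_whip = ImuSample(accel=(4.0, 0.0, 0.0), yaw=35.0, pitch=-5.0, roll=0.0)
    still_body = ImuSample(accel=(0.05, 0.0, 0.0), yaw=0.0, pitch=0.0, roll=0.0)
    print(interpret_motion(head_whip, still_body))  # view changes, locomotion stays zero
```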
  • In some embodiments, the IMU 48 includes matching components in both the face-mounted unit 22 and the neck-mounted unit 24 that make use of magnetic fields to determine their positions relative to one another. This provides high-performance detection of user head positioning and orientation.
  • The neck-mounted unit 24 includes the remaining components necessary to operate the HMD 20. Notably, these components include a battery 58 and some form of controller or processing unit. The controller may include a CPU 60 and/or a GPU 62. In some embodiments, processing features are not moved to the neck-mounted unit 24. Processors such as the CPU 60 and the GPU 62 may be built with a very small profile and do not weigh very much. Accordingly, the processing components may be placed in the face-mounted unit 22 or an additional unit, such as one positioned at the back of the user's head.
  • A number of other components may optionally be positioned in the neck-mounted unit 24. Such components include memory or storage space 64, a cooling system (fan) 66, a haptic feedback system 68, a simulated weather system (air jets/water jets) 70, speakers/subwoofers 72, a wireless transceiver (Bluetooth, Wi-Fi, near field communication, etc.), or other suitable devices known in the art. Components such as the battery 58 or processing cores 60, 62 often generate a lot of heat. In addition to the weight of the HMD 20, heat management limits the processing power of the HMD 20. The ability to include a fan 66 can therefore improve the overall processing power of the HMD 20.
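  • The thermal point can be made concrete with a hedged sketch of a clock governor: a fan-cooled neck unit can tolerate a higher temperature ceiling before throttling, which translates into more sustained processing power. The temperatures and clock steps below are invented for illustration and are not taken from the patent.

```python
# Hedged sketch of a thermal governor for the neck-mounted unit 24.
def select_clock_mhz(temp_c: float, fan_available: bool,
                     base_mhz: int = 1200, boost_mhz: int = 1800) -> int:
    """Pick a CPU/GPU clock for the neck-mounted unit given its temperature."""
    ceiling_c = 85.0 if fan_available else 70.0   # fan 66 raises the ceiling
    if temp_c >= ceiling_c:
        return base_mhz // 2       # heavy throttle to protect battery 58 and cores
    if temp_c >= ceiling_c - 10.0:
        return base_mhz            # hold the nominal clock
    return boost_mhz               # thermal headroom available: boost for rendering


if __name__ == "__main__":
    print(select_clock_mhz(72.0, fan_available=True))    # 1800: fan keeps headroom
    print(select_clock_mhz(72.0, fan_available=False))   # 600: passive design throttles
```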
  • Haptic feedback systems are often not implemented on head mounted devices for comfort reasons. Placement of the haptic feedback system 68 on the neck-mounted unit 24 increases functionality without increasing user discomfort. Similarly, including simulated weather/environment systems 70, such as air jets or water jets, on a face-mounted device is difficult because of the difficulty in obtaining a good angle at which to direct the jet at the user. When mounted on the neck, the HMD 20 has more ability to angle the jets in usable ways. Air or fluid jets may have additional uses beyond replicating micro-weather. For example, the weather/environment system 70 may also create the sense of motion or scent. Perfumes or scented sprays may be emitted from the environment system 70. Further, air jets can simulate not only the movement of air, but the user's movement through the air (e.g., as if on a virtual motorcycle).
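  • A hypothetical mapping from virtual events to the neck-mounted actuators might look like the sketch below. The event names and intensities are assumptions; the patent lists only the actuators and example sensations such as wind, motion, and scent.

```python
# Hypothetical mapping from virtual events to neck-mounted actuators
# (haptics 68, air/water jets 70, subwoofer 72). Invented for illustration.
from typing import Dict, List, Tuple

EFFECT_MAP: Dict[str, List[Tuple[str, float]]] = {
    "explosion": [("haptics_68", 1.0), ("subwoofer_72", 0.9)],
    "wind_gust": [("air_jet_70", 0.6)],
    "light_rain": [("water_jet_70", 0.3)],
    "motorcycle_ride": [("air_jet_70", 0.8), ("haptics_68", 0.4)],
}


def trigger_effects(event: str) -> List[str]:
    """Return actuator commands (intensity 0.0-1.0) for a virtual event."""
    return [f"{actuator} -> {intensity:.1f}"
            for actuator, intensity in EFFECT_MAP.get(event, [])]


if __name__ == "__main__":
    print(trigger_effects("motorcycle_ride"))
```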
  • Similarly, subwoofers 72 or other large speakers are often difficult to include on a compact, face-mounted device. The neck-mounted unit provides more surface area and an internal volume (to resonate sound waves), which in turn makes the use of subwoofers more feasible.
  • Wireless communication is also a relevant portion of modern computing and gaming. Thus, the inclusion of a wireless transceiver 74 in HMDs is beneficial. As the wireless communicator does not necessarily have to be mounted on the face of the user, optional placement on the neck reduces face-mounted components. The wireless transceiver 74 enables the HMD 20 to communicate with external networks and the Internet or additional peripherals, such as handheld controllers.
  • FIG. 6 is a block diagram including a second embodiment of component positioning in a hybrid HMD 20. FIG. 6 is similar to FIG. 5 with the addition of wireless transceivers in both the face-mounted unit 22 and the neck-mounted unit 24. In some embodiments, communication between the two units 22, 24 is wireless rather than wired. Wireless communication may include data, power, or both. In order to facilitate wireless communication, the wireless transceiver 74 of the neck-mounted unit 24 transmits signals to and from a face-mounted wireless transceiver 76. Where power is not transferred, the face-mounted unit 22 additionally requires a face-mounted battery 78 to operate.
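  • The split wireless architecture of FIG. 6 can be sketched as a simple message exchange: the face-mounted unit streams sensor readings to the neck-mounted unit and receives rendered frames in return. The JSON packet format and the queue-based stand-in for the transceivers 74 and 76 below are assumptions made purely for illustration.

```python
# Sketch of the split wireless architecture of FIG. 6 with a toy radio link.
import json
import queue


class RadioLink:
    """Toy stand-in for the wireless transceiver pair (74 <-> 76)."""
    def __init__(self) -> None:
        self.uplink = queue.Queue()     # face unit 22 -> neck unit 24
        self.downlink = queue.Queue()   # neck unit 24 -> face unit 22


def face_unit_step(link: RadioLink, imu_reading: dict) -> None:
    """Runs on the face-mounted unit 22 (powered by face-mounted battery 78)."""
    link.uplink.put(json.dumps({"type": "imu", "data": imu_reading}).encode())


def neck_unit_step(link: RadioLink) -> None:
    """Runs on the neck-mounted unit 24: consume sensor data, return a frame."""
    packet = json.loads(link.uplink.get())
    frame = {"type": "frame", "rendered_for": packet["data"]}
    link.downlink.put(json.dumps(frame).encode())


if __name__ == "__main__":
    link = RadioLink()
    face_unit_step(link, {"yaw": 12.0, "pitch": -3.0})
    neck_unit_step(link)
    print(json.loads(link.downlink.get()))
```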
  • The HMD 20 may additionally communicate with outside peripherals, such as a controller 80. Communication with the controller 80 may be wired or wireless.
  • FIG. 7 illustrates an adjustable near-eye display visor in a number of configurations. An additional issue in HMD wearability is the ability for the user to return to the real world conveniently. HMDs generally have a snug fit on the user's head. In some circumstances it is inconvenient to take off the HMD entirely. Rather, it is preferable for the user to be able to briefly view the real world again before returning to an immersive VR experience.
  • The adjustable HMD 82 includes an adjustable visor 84 attached to a head mount 86 via a hinge 88. The adjustable visor 84 includes the near-eye display of the HMD device. In use, the adjustable visor 84 is positioned in front of the user's eyes. When the user wishes to view the real-world again without taking the adjustable HMD 82 off, the user lifts the adjustable visor 84 to a raised position on their head and locks the visor in position via the hinge 88. The head mount 86 wraps fully around the user's head such that the visor 84 is not required to stabilize the adjustable HMD 82. The features of the adjustable HMD 82 may be used with the features of the hybrid HMD 20 such that an adjustable visor 84 is included in the same HMD as a neck-mounted unit 24.
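  • The patent describes the visor 84 and hinge 88 purely mechanically. As an illustrative assumption only, a runtime could react to the visor position, for example by pausing rendering while the visor is locked in the raised position, as in the sketch below.

```python
# Illustrative assumption: software reaction to visor position is not
# described in the patent; this sketch simply shows one plausible policy.
from enum import Enum, auto


class VisorPosition(Enum):
    LOWERED = auto()   # in front of the eyes, in use
    RAISED = auto()    # propped up; user is viewing the real world


def on_visor_change(position: VisorPosition) -> str:
    if position is VisorPosition.RAISED:
        return "pause rendering and duck audio"   # assumed behavior
    return "resume mixed reality experience"      # assumed behavior


if __name__ == "__main__":
    print(on_visor_change(VisorPosition.RAISED))
```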
  • FIG. 8 illustrates a hinge mechanism for an adjustable near-eye display device. FIG. 8 includes the same adjustable HMD 82 as the views of FIG. 7, though with increased focus on the hinge 88. Further illustrated are a number of the sensory suite components of the face-mounted unit 22 of the hybrid HMD 20, such as the front facing camera 42 and speakers 56.
  • FIG. 9 is a block diagram illustrating a close-up of an electrical connection within the hinge mechanism 88. In some embodiments, HMD components are located external to the visor 84. In order to accommodate those components with a wired connection, there must be electrical conductivity through the hinge 88. There are a number of means of transmitting digital signals and power through a hinge.
  • One such means includes a contact surface 89 between an inner ring 90 and an outer ring 92 of the hinge 88. Each of the head mount 86 and the adjustable visor 84 is associated with one of the inner ring or the outer ring 90, 92. Wiring connects each of the respective portions of the adjustable HMD 82 to the rings 90, 92. The contact surface 89 provides the necessary electrical connection therebetween.
  • An alternative means is to make use of the central area 94 of the hinge 88. In some embodiments, a wire merely runs through the central area 94. Alternatively, each side of the central area 94, respectively one side for the adjustable visor 84 and one side for the head mount 86, includes a contact surface.
  • FIG. 10 is a flowchart illustrating a method for communicating between a face-mounted unit and a neck-mounted unit. When both a face-mounted unit and a neck-mounted unit are used, communication between the two is necessary. The connection between the two may be wired or wireless, and may pass data, power, or both. Additionally, if only one of the units includes a power source, power must be shared between the units. In step 1002, the HMD displays a mixed reality experience to the user on a near-eye display mounted on the head of the user. In step 1004, a neck-mounted unit transmits power and electrical signals to the near-eye display via a connection, wherein the neck-mounted unit includes a battery and a processor.
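  • A minimal sketch of the two-step method of FIG. 10 is shown below. The loop structure and function names are assumptions; the patent recites only the displaying step (1002) and the transmitting step (1004).

```python
# Minimal sketch of the two-step method of FIG. 10; names are illustrative.
def transmit_from_neck_unit(frame_request: dict) -> dict:
    """Step 1004: battery 58 and processor feed the display over the connection."""
    return {"power_ok": True, "signal": {"frame_id": frame_request["frame_id"]}}


def display_mixed_reality(signal: dict) -> None:
    """Step 1002: present the frame on the near-eye display of the face unit."""
    print(f"displaying frame {signal['frame_id']}")


def run_frame(frame_id: int) -> None:
    link = transmit_from_neck_unit({"frame_id": frame_id})
    if link["power_ok"]:
        display_mixed_reality(link["signal"])


if __name__ == "__main__":
    for i in range(3):
        run_frame(i)
```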
  • Although the subject matter has been described in language specific to structural features and/or acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as examples of implementing the claims and other equivalent features and acts are intended to be within the scope of the claims.

Claims (20)

1. A head mounted display comprising:
a face-mounted unit, configured to mount on a face of a user, including:
a display; and
a neck-mounted unit configured to wrap around a neck of the user and communicatively coupled to the face unit, the neck-mounted unit having a single hollow volume that encircles the neck of the user, the hollow volume encasing:
a controller; and
a battery.
2. The head mounted display of claim 1, wherein the communicative connection between the face-mounted unit and the neck-mounted unit is implemented via a wired connection.
3. The head mounted display of claim 1, wherein the communicative connection between the face-mounted unit and the neck-mounted unit is implemented via a wireless connection, and the face-mounted unit further comprises:
a wireless transceiver; and
a second battery that powers the wireless transceiver and the display.
4. The head mounted display of claim 1, wherein the controller further comprises any of:
a central processing unit; or
a graphics processing unit.
5. The head mounted display of claim 4, wherein the battery provides the majority of the power consumed by any of:
the display;
the central processing unit; or
the graphics processing unit.
6. The head mounted display of claim 1, the face-mounted unit further comprising any of:
a binocular optical module;
a camera sensor;
an inertial measurement sensor;
a microphone;
a radar sensor;
a LIDAR sensor;
a proximity sensor;
an eye gaze sensor; or
speakers.
7. The head mounted display of claim 6, further comprising:
a sensor fusion system that is configured to fuse sensor data locally collected from the face-mounted unit into an input stream and communicates the input stream to the processor.
8. The head mounted display of claim 1, further comprising any of:
buttons; or
a peripheral control device.
9. The head mounted display of claim 1, wherein the neck-mounted unit further comprises any of:
a cooling system;
a haptic feedback system;
speakers;
an air jet system;
a memory; or
a wireless transceiver.
10. The head mounted display of claim 1, wherein the neck-mounted unit is shaped in a necklace configuration.
11. The head mounted display of claim 10, wherein the battery and the controller are positioned in a pendant module on the necklace configuration.
12. The head mounted display of claim 1, wherein the neck-mounted unit is shaped in a mantle configuration and is further supported by shoulders of the user.
13. The head mounted display of claim 1, further comprising:
a rear processing unit mounted behind the head of the user and including any of:
a graphics processing unit; or
a central processing unit.
14. A mixed reality device comprising:
a near-eye display mounted on a head of a user;
a neck-mounted unit including a battery, a processor, and a cooling unit, wherein the battery provides power for both the processor and the near-eye display and the processor provides graphical processing for the near-eye display to render, and the cooling unit manages temperatures for the battery and the processor; and
a connection between the near-eye display and the neck-mounted unit that transmits power and electrical signals there between.
15. The mixed reality device of claim 14, the near-eye display further comprising any of:
a binocular optical module;
a camera sensor;
an inertial measurement sensor;
a microphone;
a radar sensor;
a LIDAR sensor;
a proximity sensor;
an eye gaze sensor; or
speakers.
16. The mixed reality device of claim 14, wherein the neck-mounted unit further comprises any of:
a haptic feedback system;
speakers;
an air jet system;
a memory; or
a wireless transceiver.
17. The mixed reality device of claim 14, wherein the neck-mounted unit is shaped in a necklace configuration.
18. The mixed reality device of claim 17, wherein the battery and the processor are positioned in a pendant module on the necklace configuration.
19. The mixed reality device of claim 14, wherein the neck-mounted unit is shaped in a mantle configuration and is further supported by shoulders of the user.
20. A method of operating a mixed reality device comprising:
displaying a mixed reality experience on a near-eye display mounted on a head of a user to the user; and
transmitting power and electrical signals between a neck-mounted unit and the near-eye display via a connection, wherein the neck-mounted unit includes a battery and a processor contained within a single hollow volume that encircles a neck of the user.
US15/628,560 2017-06-20 2017-06-20 Mixed Reality Head Mounted Display Device Abandoned US20180365900A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US15/628,560 US20180365900A1 (en) 2017-06-20 2017-06-20 Mixed Reality Head Mounted Display Device
US15/630,292 US10401913B2 (en) 2017-06-20 2017-06-22 Mixed reality head mounted display device
CN201810637206.3A CN109100865A (en) 2017-06-20 2018-06-20 Mixed reality head-wearing display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US15/628,560 US20180365900A1 (en) 2017-06-20 2017-06-20 Mixed Reality Head Mounted Display Device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/630,292 Continuation US10401913B2 (en) 2017-06-20 2017-06-22 Mixed reality head mounted display device

Publications (1)

Publication Number Publication Date
US20180365900A1 true US20180365900A1 (en) 2018-12-20

Family

ID=64658023

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/628,560 Abandoned US20180365900A1 (en) 2017-06-20 2017-06-20 Mixed Reality Head Mounted Display Device
US15/630,292 Expired - Fee Related US10401913B2 (en) 2017-06-20 2017-06-22 Mixed reality head mounted display device

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/630,292 Expired - Fee Related US10401913B2 (en) 2017-06-20 2017-06-22 Mixed reality head mounted display device

Country Status (2)

Country Link
US (2) US20180365900A1 (en)
CN (1) CN109100865A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109631887A (en) * 2018-12-29 2019-04-16 重庆邮电大学 Inertial navigation high-precision locating method based on binocular, acceleration and gyroscope
CN110320665A (en) * 2019-03-20 2019-10-11 郑州铁路职业技术学院 A kind of VR glasses that adjustable wearing is comfortable
CN110672097A (en) * 2019-11-25 2020-01-10 北京中科深智科技有限公司 Indoor positioning and tracking method, device and system based on laser radar
CN111208646A (en) * 2020-03-04 2020-05-29 杭州光粒科技有限公司 Head Mounted Displays and Wearables
CN115561904A (en) * 2022-10-11 2023-01-03 常州工学院 An adjustable VR display device

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6502989B2 (en) * 2017-03-27 2019-04-17 本田技研工業株式会社 Head mounted display
US11402925B2 (en) * 2019-09-25 2022-08-02 Apple Inc. Crown input and feedback for head-mountable devices
CN111025641B (en) * 2019-12-20 2021-08-03 中国地质大学(武汉) A mixed reality display device based on eye tracker
US11789276B1 (en) * 2020-04-06 2023-10-17 Apple Inc. Head-mounted device with pivoting connectors
US11528953B2 (en) 2020-05-19 2022-12-20 Rockwell Collins, Inc. Display embedded visor helmet mounted display
CN112558306B (en) * 2020-12-23 2023-06-30 Oppo广东移动通信有限公司 Neckworn and Wearables
US20220299792A1 (en) * 2021-03-18 2022-09-22 Meta Platforms Technologies, Llc Lanyard for smart frames and mixed reality devices
EP4418072A4 (en) * 2021-12-08 2025-02-19 Samsung Electronics Co Ltd PORTABLE ELECTRONIC DEVICE WITH IMPLEMENTED DISTRIBUTION SYSTEM FOR CONTENT AND VISION PROCESSING
CN117687221B (en) * 2024-02-04 2024-04-12 中国民用航空飞行学院 VR glasses based on flight simulation inspection uses

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120075169A1 (en) * 2010-09-29 2012-03-29 Olympus Corporation Head-mounted display
US20120249741A1 (en) * 2011-03-29 2012-10-04 Giuliano Maciocci Anchoring virtual images to real world surfaces in augmented reality systems
US8352046B1 (en) * 2009-01-30 2013-01-08 Advanced Bionics, Llc Sound processing assembly for use in a cochlear implant system
US20160070110A1 (en) * 2014-04-09 2016-03-10 Alexey Leonidovich Ushakov Composite wearable electronic communication device
US20170068119A1 (en) * 2014-02-19 2017-03-09 Evergaze, Inc. Apparatus and Method for Improving, Augmenting or Enhancing Vision
US20170274282A1 (en) * 2016-03-23 2017-09-28 Intel Corporation Immersive gaming

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100259471A1 (en) * 2007-11-16 2010-10-14 Nikon Corporation Control device, head-mount display device, program, and control method
US20110090135A1 (en) * 2009-10-21 2011-04-21 Symbol Technologies, Inc. Interchangeable display device for a head-mounted display system
KR102218913B1 (en) * 2014-09-04 2021-02-23 엘지전자 주식회사 Smart bracelet
DE202014105859U1 (en) 2014-12-04 2015-04-17 Asia Vital Components Co., Ltd. Cooling structure of Wearable Smart Device
US20170351096A1 (en) * 2016-06-01 2017-12-07 Cheng-Ho Tsai Head-mounted equipment capable of displaying images

Also Published As

Publication number Publication date
US20180364766A1 (en) 2018-12-20
US10401913B2 (en) 2019-09-03
CN109100865A (en) 2018-12-28

Similar Documents

Publication Publication Date Title
US10401913B2 (en) Mixed reality head mounted display device
US11805347B2 (en) Display system having an audio output device
EP3884335B1 (en) Systems and methods for maintaining directional wireless links of motile devices
US11055056B1 (en) Split system for artificial reality
US11366522B1 (en) Systems and methods for providing substantially orthogonal movement of a device about a user's body part
WO2017124724A1 (en) Head-mounted display
CN205620609U (en) Modularization wear -type electronic equipment
CN205656368U (en) Head-mounted virtual reality audio-visual device
US10536666B1 (en) Systems and methods for transmitting aggregated video data
US11990689B2 (en) Antenna system for wearable devices
CN204855941U (en) Wear -type virtual reality equipment and system
US11522841B1 (en) Third-party data manipulation with privacy controls
US11816886B1 (en) Apparatus, system, and method for machine perception
CN205485063U (en) Head display
Hussein Wearable computing: Challenges of implementation and its future
US12034200B1 (en) Integrated camera antenna
US20200251071A1 (en) Wearable Device and Method Therein
US11287885B1 (en) Apparatus, system, and method for determining the position of wearables donned by users of artificial reality systems
CN208013548U (en) Reality enhancing glasses
CN205594225U (en) Novel virtual reality 3D glasses
CN119211512B (en) Augmented reality information notification method and device, electronic equipment and storage medium
US20240184250A1 (en) Antenna systems with an extended ground in a detachable cradle
US20230327328A1 (en) Antenna system for mobile devices
US20240346221A1 (en) Circuits and methods for reducing the effects of variation in inter-die communication in 3d-stacked systems
US20240097311A1 (en) Antenna for wearable electronic devices

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE
