US20140198130A1 - Augmented reality user interface with haptic feedback - Google Patents
- Publication number
- US20140198130A1 (application US 13/741,826; US201313741826A)
- Authority
- US
- United States
- Prior art keywords
- augmented reality
- feedback
- control signal
- environment
- event
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
Definitions
- the present invention relates to a device that is configured to generate feedback based on an event that occurs in an augmented reality environment.
- Augmented reality devices provide an augmented reality environment in which physical objects in a physical space are concurrently displayed with virtual objects in a virtual space.
- Various augmented reality devices recognize specific codes (e.g., QR codes) disposed on physical objects and display one or more virtual objects in a view that includes the physical objects augmented with the virtual objects based on the specific codes.
- Other augmented reality devices can recognize specific, known physical objects using image recognition such as by transmitting images to a server that performs the image recognition.
- Despite advances in augmented reality systems, the ability to interact with an augmented virtual environment is limited. For example, conventional augmented reality devices typically use speech recognition for providing input in relation to the augmented reality environment. Providing useful feedback to the user is also limited. In addition, recognizing objects in the virtual reality environment may be computationally intensive and reduce usability in many instances.
- the disclosure relates to a device that is configured to generate feedback based on an event that occurs in an augmented reality environment, provide input to the augmented reality environment, and be recognized in association with the augmented reality environment.
- the augmented reality environment may be generated by an augmented reality device communicably coupled to the device.
- a device may be configured to provide feedback based on an augmented reality environment.
- the device may comprise, for example, a processor configured to receive a control signal from an augmented reality device and a feedback device configured to provide a feedback based on the received control signal.
- the augmented reality device may generate an augmented reality environment and may be remote from the device.
- the control signal received by the device may be representative of an event occurring in the augmented reality environment.
- the augmented reality environment may include a physical space in which at least one physical object exists and an augmented reality space in which one or more virtual objects that augment the physical object are displayed.
- the event may include an interaction between the device and the augmented reality environment, a confirmation of an action occurring with respect to the augmented reality environment, a confirmation that the device is recognized by the augmented reality device, an interaction between the device and one or more virtual objects displayed in the augmented reality space, and/or other occurrence in the augmented reality environment.
- the device may comprise, for example, a communication port, a position or orientation device, an input component, and/or other components.
- the communication port may include an interface through which a communication channel may be maintained with, for example, the augmented reality device.
- the control signal from the augmented reality device may be received via the communication channel, which may include a wired or a wireless communication channel.
- the position or orientation device may be configured to provide the augmented reality device with a position, an orientation, or both, via the communication channel.
- the input component may be configured to receive an input such as, for example, a button press, a gesture, and/or other input.
- the input may be communicated, by the processor, to the augmented reality device via the communication channel.
- the processor of the device may be configured to execute one or more modules, including, for example, a feedback control module, a communication module, and/or other computer program modules.
- the feedback control module may be configured to receive a control signal and cause the feedback device to provide the feedback.
- the communication module may be configured to facilitate communication between the device and the augmented reality device.
- the feedback control module may be configured to receive a control signal and cause the feedback device to provide the feedback.
- the control signal may be representative of an event at the augmented reality device.
- the event may include, for example, one or more virtual objects being displayed in the augmented virtual environment, one or more interactions between the device and the one or more virtual objects, and/or other occurrence related to the augmented reality environment.
- the feedback control module may be configured to provide the control signal to the feedback device.
- the control signal may be directly applied to the feedback device to cause the feedback.
- the feedback control module may be configured to determine a feedback signal based on the received control signal. In these embodiments, the feedback control module may consult a lookup table to determine the feedback signal based on the received control signal.
- the communication module may be configured to facilitate communication between the device and the augmented reality device.
- the communication module may be configured to facilitate communication between the device, the augmented reality device, the server, a handheld device that comprises similar components and functionality as the device, and/or other devices that may be in communication with the device.
- the communication module may be configured to provide a wired or wireless communication channel for communication between the device, the augmented reality device, the handheld device, the server, and/or other device.
- the feedback device may comprise a haptic output device configured to provide haptic feedback in the form of a haptic effect, a visual device configured to provide a visual feedback, an audio device configured to provide an audible feedback, and/or other device that produces feedback.
- the haptic output device may include an actuator, for example, an electromagnetic actuator such as an Eccentric Rotating Mass (“ERM”) in which an eccentric mass is moved by a motor, a Linear Resonant Actuator (“LRA”) in which a mass attached to a spring is driven back and forth, or a “smart material” such as piezoelectric, electro-active polymers or shape memory alloys, a macro-composite fiber actuator, an electro-static actuator, an electro-tactile actuator, and/or other type of actuator that provides a physical feedback such as a haptic (e.g., vibrotactile) feedback.
- the haptic output device may include non-mechanical or non-vibratory devices such as those that use electrostatic friction (ESF), ultrasonic surface friction (USF), or those that induce acoustic radiation pressure with an ultrasonic haptic transducer, or those that use a haptic substrate and a flexible or deformable surface, or those that provide projected haptic output such as a puff of air using an air jet, and so on.
- the visual device may be configured to generate a visual feedback such as visible light at the device. For example, the visual feedback may visually indicate the occurrence of an event in the augmented reality environment.
- the feedback device may be configured to receive one or more signals (e.g., the control signal or the feedback signal) from the feedback control module. Based on the one or more signals, the haptic output device, visual device, audio device, and/or other feedback devices may provide feedback via the device.
- the device may comprise an identifier device configured to generate identifying indicia.
- the identifying indicia may be used by the augmented reality device to identify the device.
- the identifying indicia may comprise a visible optical signature (e.g., an optical signature that is within visible wavelengths of light) or a non-visible signature (e.g., an optical signature that is not within the visible wavelengths of light).
- an augmented reality symbol may be disposed on a surface of the device. The augmented reality symbol may be used, for example, to determine an orientation of the device within the augmented reality environment, identify the presence of the device in the augmented reality environment, and/or allow other forms of recognition of the device.
- the device may emit an audible signature, an infrared signature, and/or other signature that may be recognized by the augmented reality device.
- the device may be configured as a wearable device such as a ring.
- the feedback device may comprise a light-emitting band disposed about the ring.
- the feedback from the light-emitting band may include a color, a pattern, and/or other visual characteristics that may coordinate with one or more virtual objects in the augmented reality environment.
- the feedback device may comprise one or more haptic output devices spaced apart at the ring.
- the device may include a glove, a thimble, a ring, and/or other device that can be worn.
- the feedback device may comprise a light-emitting band disposed at a portion of the device at a fingertip and/or other portion of the device.
- the feedback device may comprise one or more haptic output devices spaced apart throughout the device.
- An identifying indicia and/or augmented reality symbol may be disposed on a surface of the device.
- the device may cover at least a fingertip on a finger of a wearer's hand.
- An identifying indicia or augmented reality symbol may be disposed on a surface of the device covering the fingertip of the wearer and/or other surface of the device.
- a handheld device may comprise the same or similar components and functionality and may interact in a same or similar manner with the augmented reality device as the device.
- the handheld device may comprise, for example, a stylus, a joystick, a mobile phone, a video game controller, and/or other handheld device that may be communicably coupled to the augmented reality device.
- both the device and the handheld device may simultaneously interact with the augmented reality device.
- An augmented reality (“AR”) device may be configured to generate an augmented reality environment comprising both an augmented reality space and a physical space.
- the AR device may comprise, for example, a communication port, an imaging device, a processor, and/or other components.
- the communication port may comprise an interface through which a communication channel may be maintained with, for example, the device.
- An imaging device such as a camera may be configured to image the physical space.
- the imaging device of the augmented reality device may comprise a camera, an infrared detector, and/or other image recording device.
- the processor may be configured to generate the augmented reality space coincident with the physical space.
- the processor may be configured to recognize at least one physical object in the physical space and augment the at least one physical object with one or more virtual objects in the augmented reality space.
- the processor may be configured to determine an event within the augmented reality environment and communicate a control signal representative of that event to the device via the wired or wireless communication channel.
- the control signal may cause feedback to be generated at the device.
- the processor of the AR device may be configured to execute one or more modules, including, for example, an object recognition module, an object generation module, an event handler module, a control signal generation module, a communication module, and/or other computer program modules.
- the object recognition module may be configured to recognize physical objects in the physical space.
- the object generation module may be configured to generate virtual objects to augment recognized physical objects.
- the event handler module may be configured to detect whether an event occurs in the augmented reality environment.
- the control signal generation module may be configured to receive information relating to an event and generate a control signal for transmission to the device.
- the communication module may be configured to facilitate communication between the augmented reality device and the device.
- a system of providing feedback based on an augmented reality environment may comprise the augmented reality device, the device, and/or the handheld device.
- FIG. 1 illustrates a block diagram of an exemplary system of providing feedback based on an augmented reality environment, according to an implementation of the invention
- FIG. 2A illustrates a schematic view of an exemplary device configured as a wearable device such as a ring, according to various implementations of the invention
- FIG. 2B illustrates a schematic view of an exemplary device configured as a handheld device such as a stylus, according to various implementations of the invention
- FIG. 3 illustrates a schematic view of an exemplary feedback device, according to an implementation of the invention
- FIG. 4 illustrates a block diagram of an exemplary augmented reality environment, according to an implementation of the invention
- FIGS. 5A, 5B, and 5C illustrate schematic views of exemplary augmented reality devices, according to various implementations of the invention
- FIG. 6 illustrates a flowchart of an exemplary process of providing feedback based on an augmented reality environment, according to an implementation of the invention.
- FIG. 7 illustrates a flowchart of an exemplary process of providing feedback based on an augmented reality environment, according to an implementation of the invention.
- FIG. 1 illustrates a block diagram of an exemplary system 10 of providing feedback based on an augmented reality (“AR”) environment.
- the system 10 may comprise a device 100 , an augmented reality (“AR”) device 200 , a communication channel 300 , a server 400 , a handheld device 102 , and/or other devices that may be in communication with the device 100 , the AR device 200 , or the server 400 .
- the device 100 may be configured to provide feedback based on an AR environment.
- the device 100 may be configured to generate feedback based on an event that occurs in an AR environment, provide input to the AR environment, and be recognized in association with the AR environment.
- the AR environment may be generated by the AR device 200 communicably coupled to the device 100 .
- the AR device 200 may generate an AR environment and may be remote from the device 100 .
- the control signal received by the device 100 may be representative of an event occurring in the AR environment.
- the AR environment may include a physical space in which at least one physical object exists and an augmented reality (“AR”) space in which one or more virtual objects that augment the physical object are displayed.
- the AR device 200 may be configured in the shape of an eyeglass.
- the event may include an interaction between the device 100 and the AR environment, a confirmation of an action occurring with respect to the AR environment, a confirmation that the device 100 is recognized by the AR device 200 , an interaction between the device 100 and one or more virtual objects displayed in the AR space, an interaction between a user and the AR environment, and/or other occurrence in the AR environment.
- the device 100 may comprise, for example, a processor 110 configured to receive a control signal from an AR device 200 , a feedback device 120 configured to provide a feedback based on the received control signal, a communication port 130 , a position/orientation device 140 , an input component 150 , an identifier device 160 , and/or other components.
- the device 100 may include a glove, a thimble, a ring, and/or other device that can be worn.
- the device 100 may be configured as a handheld device, such as a stylus, a joystick, a mobile phone, a video game controller, and/or other handheld device that may be communicably coupled to the AR device 200 .
- the device 100 and the AR device 200 may be separate physical devices or may be integrated in a single physical device.
- the processor 110 of the device 100 may be configured to execute one or more modules, including, for example, a feedback control module 111 , a communication module 112 , and/or other computer program modules of the device 100 .
- the feedback control module 111 may be configured to receive a control signal and cause the feedback device 120 to provide feedback.
- the communication module 112 may be configured to facilitate communication between the device 100 and the AR device 200 .
- the feedback control module 111 may be configured to receive a control signal (e.g., from the AR device 200 ) and cause the feedback device 120 to provide the feedback via the device 100 .
- the control signal may be representative of an event at the AR device 200 .
- the event may include, for example, one or more virtual objects being displayed in the augmented virtual environment, one or more interactions between the device 100 and the one or more virtual objects, and/or other occurrence related to the AR environment.
- the feedback control module 111 may be configured to provide the control signal to the feedback device 120 .
- the control signal may be directly applied to the feedback device 120 to cause the feedback.
- the feedback control module 111 may be configured to determine a feedback response based on the received control signal.
- the feedback control module 111 may consult a lookup table of the device 100 to determine which types of feedback and which feedback signals to include in the feedback response based on the received control signal.
- the feedback response may include a single feedback signal, a plurality of feedback signals for a single feedback device 120 , a plurality of feedback signals for a plurality of feedback devices 120 , a pattern of feedback signals for one or more feedback devices 120 , and/or other types of feedback response.
- the type of feedback response may indicate the type of event represented by the control signal. For example, a feedback response comprising a single signal may indicate that the event represents the recognition of the device 100 in the AR environment. A feedback response comprising a pattern of signals may indicate that the event represents an interaction between the device 100 and the AR environment.
- the indications associated with the different types of feedback responses are not limited to the described examples.
- the lookup table may store associations between a plurality of control signals and a plurality of feedback responses. For example, when a control signal comprises information indicating that an event occurred, the lookup table may store a feedback response associated with that control signal. When a control signal comprises information indicating that a type of event occurred, the lookup table may store one or more different feedback responses for one or more types of event that may be indicated by the information of the control signal. When a control signal comprises information indicating that virtual object(s) were displayed in the augmented virtual environment, the lookup table may store a different feedback response for different virtual objects that may be displayed in the augmented virtual environment.
- the feedback response may coordinate with one or more of the virtual objects indicated in the signal, such that the feedback response corresponds to one or more characteristics of the one or more virtual objects indicated in the signal.
- the feedback may comprise a color, a shape, a pattern, a number of feedback signals, and/or a characteristic that is similar to the virtual objects indicated.
- the lookup table may store a different feedback response for different interactions that may occur between the device 100 and the AR environment.
- the feedback control module 111 may retrieve a feedback response from a server 400 that is configured to store a lookup table comprising a plurality of control signals and associated feedback responses.
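For illustration only, a minimal sketch of a lookup-table-driven feedback control module as described above might look like the following; the control-signal codes, feedback types, and class and method names are assumptions introduced here, not details taken from the disclosure.

```python
# Hypothetical sketch of a lookup-table-driven feedback control module.
# Control-signal codes, feedback types, and device APIs are illustrative only.

FEEDBACK_LOOKUP = {
    # control-signal code      -> feedback response (one or more feedback signals)
    "DEVICE_RECOGNIZED":          [("haptic", "single_pulse")],
    "VIRTUAL_OBJECT_INTERACTION": [("haptic", "pulse_pattern"), ("visual", "blue_flash")],
    "ACTION_CONFIRMED":           [("audio", "chime"), ("visual", "green_flash")],
}

class FeedbackControlModule:
    def __init__(self, feedback_devices):
        # feedback_devices maps a feedback type ("haptic", "visual", "audio")
        # to an object assumed to expose a play(signal) method.
        self.feedback_devices = feedback_devices

    def on_control_signal(self, code):
        """Look up the feedback response for a received control signal and
        drive the corresponding feedback devices."""
        response = FEEDBACK_LOOKUP.get(code)
        if response is None:
            return  # unknown signal; a device could instead query a remote server here
        for feedback_type, signal in response:
            device = self.feedback_devices.get(feedback_type)
            if device is not None:
                device.play(signal)
```

Under this sketch, a single-signal response could indicate that the device was recognized, while a patterned response could indicate an interaction, mirroring the distinction drawn above.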
- the communication module 112 may be configured to facilitate communication between the device 100 and the AR device 200 .
- the communication module 112 may be configured to facilitate communication between the device 100 , the AR device 200 , the server 400 , the handheld device 102 , which may comprise similar components and functionality as the device 100 , and/or other devices that may be in communication with the device 100 .
- the communication module 112 may be configured to provide a wired or wireless communication channel 300 for communication between the device 100 , the AR device 200 , the handheld device 102 , the server 400 , and/or other device in communication with the device 100 .
- the feedback device 120 may comprise one or more haptic output devices configured to provide haptic feedback in the form of a haptic effect, one or more visual devices configured to provide a visual feedback, one or more audio devices configured to provide an audible feedback, and/or other device that produces feedback.
- the haptic output device may include an actuator, for example, an electromagnetic actuator such as an Eccentric Rotating Mass (“ERM”) in which an eccentric mass is moved by a motor, a Linear Resonant Actuator (“LRA”) in which a mass attached to a spring is driven back and forth, or a “smart material” such as piezoelectric, electro-active polymers or shape memory alloys, a macro-composite fiber actuator, an electro-static actuator, an electro-tactile actuator, and/or other type of actuator that provides a physical feedback such as a haptic (e.g., vibrotactile) feedback.
- the haptic output device may include non-mechanical or non-vibratory devices such as those that use electrostatic friction (ESF), ultrasonic surface friction (USF), or those that induce acoustic radiation pressure with an ultrasonic haptic transducer, or those that use a haptic substrate and a flexible or deformable surface, or those that provide projected haptic output such as a puff of air using an air jet, and so on.
- the visual device may be configured to generate a visual feedback such as visible light at the device 100 .
- the visual feedback may visually indicate the occurrence of an event in the AR environment.
- the feedback device 120 may be configured to receive one or more signals (e.g., the control signal or the feedback signal) from the feedback control module 111 . Based on the one or more signals, a haptic output device, visual device, audio device, and/or other feedback devices 120 may provide feedback via the device 100 .
- the communication port 130 may include an interface through which a communication channel 300 may be maintained with, for example, the AR device 200 .
- the control signal from the AR device 200 may be received via the communication channel 300 , which may include a wired or a wireless communication channel.
- the position/orientation device 140 may be configured to provide the AR device 200 with a position, an orientation, or both, via the communication channel 300 .
- the position/orientation device 140 may comprise a gyroscope, a geospatial positioning device, a compass, and/or other orienting or positioning devices.
- the input component 150 may be configured to receive an input such as, for example, a button press, a gesture, and/or other input.
- the input may be communicated, by the processor 110 , to the AR device 200 via the communication channel 300 .
- the input component 150 may include a touch pad, a touch screen, a mechanical button, a switch, and/or other input component that can receive an input.
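As a sketch only, the device-side reporting described above could be expressed as follows; the message fields, JSON encoding, and channel API are assumptions rather than details from the disclosure.

```python
# Hypothetical sketch: the device packages position/orientation readings and
# queued inputs into one message and sends it over the communication channel.
import json
import time

def report_state(channel, position_orientation_device, pending_inputs):
    """Send the current position/orientation and any pending inputs to the AR device.

    `channel` is assumed to expose send(bytes); `position_orientation_device`
    is assumed to expose read() returning a (position, orientation) pair.
    """
    position, orientation = position_orientation_device.read()
    message = {
        "timestamp": time.time(),
        "position": position,        # e.g. (x, y, z) from a geospatial positioning device
        "orientation": orientation,  # e.g. (roll, pitch, yaw) from a gyroscope/compass
        "inputs": pending_inputs,    # e.g. [{"type": "button_press", "id": 1}]
    }
    channel.send(json.dumps(message).encode("utf-8"))
```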
- the identifier device 160 may be configured to generate identifying indicia for the device 100 .
- the identifying indicia may be used by the AR device 200 to identify the device 100 .
- the identifying indicia may comprise a visible optical signature (e.g., an optical signature that is within visible wavelengths of light) or a non-visible signature (e.g., an optical signature that is not within the visible wavelengths of light).
- the feedback device 120 may generate the identifying indicia such as by generating an optical signature.
- an augmented reality (“AR”) symbol may be disposed on a surface of the device 100 .
- the AR symbol may be used, for example, to determine an orientation of the device 100 within the AR environment, identify the presence of the device 100 in the AR environment, and/or allow other forms of recognition of the device 100 .
- the device 100 may emit an audible signature, an infrared signature, and/or other signature that may be recognized by the AR device 200 .
- the device 100 may be configured as a wearable device such as a ring 200 .
- the device 100 may include a wearable device such as a glove, a thimble, and/or other device 100 that can be worn.
- the feedback device 120 may comprise one or more devices.
- the one or more devices may be disposed at one or more portions of the device 100 .
- the identifier device 160 may comprise an identifying indicia and/or AR symbol that may be disposed on a surface of the device 100 covering the fingertip of the wearer and/or other surface of the device 100 .
- the identifier device 160 may generate the identifying indicia and/or AR symbol.
- the device 100 may cover at least a fingernail on a finger of a wearer's hand.
- An identifying indicia or AR symbol may be disposed on a surface of the device 100 covering the fingernail of the wearer and/or other surface of the device 100 .
- the device 100 may be configured as the handheld device 102 .
- Handheld device 102 may comprise the same or similar components and functionality and may interact in a same or similar manner with the AR device 200 as the device 100 .
- the handheld device 102 may comprise, for example, a stylus, a joystick, a mobile phone, a video game controller, and/or other handheld device 102 that may be communicably coupled to the AR device 200 .
- both the device 100 and the handheld device 102 may simultaneously interact with the AR device 200 .
- the feedback device 120 of the device 100 may comprise one or more devices.
- the one or more devices may be spaced apart at the device 100 .
- the feedback device 120 may comprise, for example, one or more haptic output devices 122 configured to provide one or more haptic effects, one or more visual devices 124 configured to provide a visual feedback, one or more audio devices 126 configured to provide an audible feedback, a light-emitting band 128 , and/or other device that produces feedback.
- the haptic output device 122 may include an actuator, for example, an electromagnetic actuator such as an Eccentric Rotating Mass (“ERM”) in which an eccentric mass is moved by a motor, a Linear Resonant Actuator (“LRA”) in which a mass attached to a spring is driven back and forth, or a “smart material” such as piezoelectric, electro-active polymers or shape memory alloys, a macro-composite fiber, an electro-static actuator, an electro-tactile actuator, and/or other type of actuator that provides a physical feedback such as a haptic (e.g., vibrotactile) feedback.
- the haptic output device may include non-mechanical or non-vibratory devices such as those that use electrostatic friction (ESF), ultrasonic surface friction (USF), or those that induce acoustic radiation pressure with an ultrasonic haptic transducer, or those that use a haptic substrate and a flexible or deformable surface, or those that provide projected haptic output such as a puff of air using an air jet, and so on.
- one or more haptic output devices 122 may be spaced apart throughout the device 100 .
- the visual device 124 may be configured to generate a visual feedback such as visible light at the device 100 .
- the visual feedback may visually indicate the occurrence of an event in the AR environment.
- the audio device 126 may be configured to generate audio feedback such as one or more sounds at the device 100 .
- the audio feedback may audibly indicate the occurrence of an event in the AR environment.
- the light-emitting band 128 may be configured to generate a band of light emanating from and/or around the device 100 .
- the light emitted via the band 128 may include a color, a pattern, and/or other visual characteristics.
- the visual characteristics may coordinate with one or more virtual objects in the AR environment.
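A minimal sketch of that coordination, assuming the control signal carries a virtual object's color and pattern and that the band exposes simple setters (all of which are assumptions):

```python
# Hypothetical sketch: mirror a virtual object's visual characteristics on the
# light-emitting band 128. Field names and the band API are assumptions.
def coordinate_band_with_virtual_object(band, virtual_object):
    color = virtual_object.get("color", (255, 255, 255))  # RGB tuple assumed
    pattern = virtual_object.get("pattern", "solid")       # e.g. "solid", "pulse", "chase"
    band.set_color(color)
    band.set_pattern(pattern)
```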
- an AR device 200 may be configured to generate an AR environment comprising both an AR space and a physical space.
- the AR device 200 may comprise, for example, a processor 210 , an imaging device 220 , a communication port 230 , and/or other components.
- the processor 210 may be configured to generate the AR space coincident with the physical space.
- the processor 210 may be configured to recognize at least one physical object in the physical space and augment the at least one physical object with one or more virtual objects in the AR space.
- the processor 210 may be configured to determine an event within the AR environment and communicate a control signal representative of that event to the device 100 via the wired or wireless communication channel 300 .
- the control signal may cause feedback to be generated at the device 100 .
- the imaging device 220 may be configured to image the physical space.
- the imaging device 220 may comprise one or more cameras, an infrared detector, a video camera, and/or other image recording device.
- the communication port 230 may comprise an interface through which a communication channel 300 may be maintained with, for example, the device 100 .
- the processor 210 may be configured to execute one or more modules, including, for example, an object recognition module 211 , an object generation module 212 , an event handler module 213 , a control signal generation module 214 , a communication module 215 , and/or other computer program modules.
- the object recognition module 211 may be configured to recognize physical objects in the physical space.
- the object generation module 212 may be configured to generate virtual objects to augment recognized physical objects.
- the event handler module 213 may be configured to detect whether an event occurs in the AR environment.
- the control signal generation module 214 may be configured to receive information relating to an event and generate a control signal for transmission to the device 100 .
- the communication module 215 may be configured to facilitate communication between the AR device 200 and the device 100 .
- the object recognition module 211 may be configured to recognize objects in a physical space.
- the object recognition module 211 may communicate with the imaging device 220 and a storage of the AR device 200 to recognize an object in the physical space.
- the object recognition module 211 may receive visual data captured from the imaging device 220 and may process the visual data to determine whether one or more objects exist in the captured visual data.
- the object recognition module 211 may compare the captured objects that exist in the visual data with objects stored in the storage.
- the object recognition module 211 may compare the pixels of a captured object with the pixels of a stored object in the storage according to known techniques. When a threshold percentage of pixels (e.g., 80%, 90%, 100%, and/or other percentages) of the captured object match the pixels of a stored object, the object recognition module 211 may determine that the captured object has been recognized as the stored object. In some implementations, the threshold percentage may depend upon a resolution of the imaging device 220 .
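The disclosure does not fix a particular comparison technique; as an illustration only, and under the assumption of equal-sized grayscale images, the threshold test could be sketched as follows.

```python
# Hypothetical sketch of threshold-based matching between a captured object image
# and stored object images (assumes equal-sized grayscale numpy arrays).
import numpy as np

def matches_stored_object(captured, stored, threshold=0.80, tolerance=8):
    """Return True if at least `threshold` of the pixels agree within `tolerance`."""
    if captured.shape != stored.shape:
        return False
    agreeing = np.abs(captured.astype(int) - stored.astype(int)) <= tolerance
    return agreeing.mean() >= threshold

def recognize(captured, storage):
    """Return the first stored object whose image matches the captured object, if any."""
    for stored_object in storage:  # storage: iterable of {"name": ..., "image": ...}
        if matches_stored_object(captured, stored_object["image"]):
            return stored_object
    return None
```

The 0.80 threshold is only an example value; as noted above, the threshold percentage could depend on the resolution of the imaging device 220.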
- the object recognition module 211 may obtain information relating to the stored object and transmit the information relating to the stored object and the information relating to the captured object to the object generation module 212 .
- the information transmitted to the object generation module 212 may include, for example, image data for the stored object, a type of the stored object, the location of the captured object in the physical space, a proximity of the captured object to other physical objects, context information relating to the stored object, context information relating to the captured object, and/or other data associated with the stored object or the captured object.
- the object recognition module 211 may transmit the information relating to the stored object and the information relating to the captured object to one or more of the event handler module 213 , the control signal generation module 214 , and/or other modules of the processor 210 .
- the object recognition module 211 may transmit data relating to the captured object to the server 400 such that the server 400 can perform object recognition.
- the server 400 may communicate information relating to a stored object that matches the captured object to the object recognition module 211 .
- the object recognition module 211 may transmit the information relating to the stored object from the server 400 and the information relating to the captured object to the object generation module 212 .
- the server 400 may communicate an indication that no match was found.
- the object generation module 212 may receive information relating to a physical object from the object recognition module 211 and may generate one or more virtual objects to augment the physical object in the AR environment.
- the object generation module 212 may access the storage to determine whether one or more virtual objects are associated with the physical object.
- the object generation module 212 may communicate with the server 400 to determine whether a storage of the server 400 has stored one or more associations between the one or more physical objects and one or more virtual objects.
- the server 400 may communicate, to the object generation module 212 , data related to the associated virtual objects.
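For illustration, the local-storage-then-server lookup described above might be sketched as follows; the storage layout and the server call are assumptions.

```python
# Hypothetical sketch of the object generation module's association lookup:
# consult local storage first, then fall back to the server.
def virtual_objects_for(physical_object_id, local_storage, server=None):
    """Return the virtual objects associated with a recognized physical object."""
    associations = local_storage.get(physical_object_id)
    if associations:
        return associations
    if server is not None:
        # server.lookup_associations is an assumed remote call returning a list or None
        associations = server.lookup_associations(physical_object_id)
        if associations:
            return associations
    return []  # no association found; the physical object is left unaugmented
```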
- FIG. 4 illustrates a block diagram of an exemplary AR environment 500 .
- the AR environment 500 comprises a physical space 520 comprising one or more physical objects 520 a, 520 b, . . . , 520 n and an AR space 510 comprising one or more virtual objects 510 a, 510 b, . . . , 510 n that augment one or more physical objects 520 a, 520 b, . . . , 520 n in the physical space 520 .
- the object generation module 212 may augment a physical object 520 n with one or more virtual objects 510 a, 510 b, . . . , 510 n in the AR space 510 .
- the object generation module 212 may display the AR space 510 (and one or more virtual objects 510 a, 510 b, 510 n ) via a display surface of the AR device 200 .
- the AR space 510 and one or more virtual objects 510 a, 510 b, . . . , 510 n displayed may be displayed in a three-dimensional manner via the display surface of the AR device 200 .
- the AR environment 500 displayed via the display of the AR device 200 may include the physical space 520 and an AR space 510 .
- the physical space 520 may be imaged by the imaging device 220 and displayed via the display.
- the physical space 520 may simply be viewed through the display, such as in embodiments where the display is configured as an at least partially transparent display (e.g., a lens) through which the physical space 520 may be viewed.
- a single virtual object 510 a may augment a single physical object 520 a or a plurality of physical objects 520 a, 520 b, . . . 520 n.
- a plurality of virtual objects 510 a, 510 b, 510 n may augment a single physical object 520 a or a plurality of physical objects 520 a, 520 b, 520 n.
- the number and types of virtual objects 510 a, 510 b, . . . 510 n that augment physical objects 520 a, 520 b, . . . 520 n that exist in the physical space 520 is not limited to the examples described.
- the event handler module 213 may be configured to detect whether an event occurs in the AR environment.
- the event handler module 213 may receive data from the imaging device 220 , the object recognition module 211 , the object generation module 212 , the storage, and/or other modules or devices of the AR device 200 .
- the storage of the AR device 200 may store data related to one or more events which the AR device 200 may recognize.
- the storage of the AR device 200 may store data related to events including, for example, an interaction between the device 100 and the AR environment, a confirmation of an action occurring with respect to the AR environment, a confirmation that the device 100 is recognized by the AR device 200 , an interaction between the device 100 and one or more virtual objects displayed in the AR space 510 , a generation of a specific type of virtual object to augment a physical object, a recognition of the device 100 , a recognition of the handheld device 102 , an interaction between a user and the AR environment, and/or other occurrence related to the AR environment.
- the event handler module 213 may receive visual data from the imaging device 220 , information relating to captured objects in the visual data from the object recognition module 211 , information relating to virtual objects generated by the object generation module 212 , and/or other information related to the AR environment.
- the event handler module 213 may compare the received information to data related to events stored in the storage to determine whether the information (or a portion of the information) is associated with an event.
- the event handler module 213 may transmit event data including the received information and data relating to the associated event to the control signal generation module 214 .
- the event handler module 213 may receive data from the processor indicating that an interaction occurred between the device 100 and the AR environment, one or more virtual objects in the AR environment changed, input was received from the device 100 , input received from the device 100 was processed by the AR device 200 , an interaction occurred between a user and the AR environment, and/or other processing was performed by the AR device 200 .
- the event handler module 213 may compare the data received from the processor 210 with data stored in the storage to determine whether the data is associated with an event. When some or all of received information is associated with an event stored in the storage, the event handler module 213 may transmit event data including the received information and data relating to the associated event to the control signal generation module 214 .
- the event handler module 213 may transmit event data including the received information to the server 400 such that the server 400 can perform event handling.
- the server 400 may communicate information relating to the associated event to the event handler module 213 .
- the event handler module 213 may transmit event data including the received information and data relating to the associated event to the control signal generation module 214 .
- the server 400 may communicate an indication that no match was found.
- the control signal generation module 214 may be configured to receive the event data from the event handler module 213 and generate a control signal based on the event data for transmission to the device 100 .
- the storage of the AR device 200 may include a lookup table that associates a plurality of events and a respective plurality of control signals. Based on the event data received from the event handler module 213 , the control signal generation module 214 may generate a control signal for transmission to the device 100 . For example, the control signal generation module 214 may compare the received event data to the data stored at the storage. When some or all of the event data matches an event stored in the storage, the control signal generation module 214 may generate a control signal related to the control signal associated with the matched event.
- control signal generation module 214 may communicate the event data to the server 400 to determine whether a storage of the server has stored a control signal associated with some or all of the event data.
- the control signal may comprise, for example, information indicating that an event occurred, information indicating that a specific type of event occurred, information indicating one or more virtual objects have been/are displayed in the augmented virtual environment, information indicating one or more interactions between the device 100 and the one or more virtual objects, and/or other information relating to the event in the AR environment.
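Putting the event handler and the control signal generation module together, a minimal sketch might look like the following; the event names, control-signal codes, and channel API are assumptions, not the patent's own identifiers.

```python
# Hypothetical sketch of event handling and control-signal generation on the AR
# device. Event types, control-signal codes, and the channel API are illustrative.
EVENT_TO_CONTROL_SIGNAL = {
    "device_recognized":          "DEVICE_RECOGNIZED",
    "virtual_object_interaction": "VIRTUAL_OBJECT_INTERACTION",
    "action_confirmed":           "ACTION_CONFIRMED",
}

def handle_event(event_data, channel):
    """Match event data against known event types and send the associated control signal."""
    code = EVENT_TO_CONTROL_SIGNAL.get(event_data.get("type"))
    if code is None:
        return  # not a recognized event; a real system might defer to the server instead
    control_signal = {
        "code": code,
        # Event details (e.g. which virtual objects were involved) can be attached
        # so that the device can coordinate its feedback with them.
        "virtual_objects": event_data.get("virtual_objects", []),
    }
    channel.send_control_signal(control_signal)
```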
- the communication module 215 may be configured to facilitate communication between the AR device 200 and the device 100 . In some implementations, the communication module 215 may be configured to facilitate communication between the AR device 200 , the device 100 , the server 400 , the handheld device 102 , and/or other devices that may be in communication with the AR device 200 .
- the communication module 215 may be configured to provide a wired or wireless communication channel 300 for communication between the AR device 200 , the device 100 , and/or the handheld device 102 .
- the communication module 215 may be configured to provide communication between the AR device 200 , the device 100 , the handheld device 102 , the server, and/or other device via the wired or wireless communication channel 300 or via a separate communication channel.
- the communication module 215 may be configured to communicate the control signal generated by the control signal generation module 214 to the device 100 and/or the handheld device 102 via a wired or wireless communication channel 300 .
- the processor 210 of the AR device 200 may be configured to recognize the device 100 when the device 100 is moved within a field of view of the imaging device 220 and/or within the physical space 520 of the AR environment 500 .
- the object recognition module 211 of the AR device 200 may be configured to recognize the device 100 by comparing image data from the imaging device 220 with image data stored in the storage.
- the storage of the AR device 200 may include image data corresponding to the device 100 .
- the storage may include image data corresponding to one or more indicia that may be disposed on the device 100 .
- the indicia may comprise a product code, a QR code, an image associated with the device 100 , and/or other image used to identify the device 100 .
- the processor 210 of the AR device 200 may be configured to recognize an audible signature, an infrared signature, and/or other signature generated by the device 100 .
- the control signal generation module 214 may generate a control signal that may be representative of the recognition of the device 100 such that the feedback generated at the device 100 indicates the recognition.
- the processor 210 may be configured to receive a position of the device 100 and/or an orientation of the device 100 from the device 100 .
- the position and/or orientation of the device 100 may be communicated via the communication channel 300 between the device 100 and the AR device 200 .
- the processor 210 may be configured to determine the position of the device 100 and/or the orientation of the device 100 within the AR environment 500 based on the received position and/or orientation.
- a position indicator image and/or orientation indicator image may be disposed on the device 100 .
- the object recognition module 211 may recognize the position indicator image and/or the orientation indicator image when recognizing that the device 100 is within the view of the imaging device and/or within the physical space 520 of the AR environment 500 .
- the position indicator image and/or the orientation indicator image data may be processed by the object recognition module 211 , the event handler module 213 , and/or other modules of the AR device 200 to determine a position and/or an orientation of the device 100 within the AR environment 500 .
- the processor 210 may be configured to position the device 100 within the AR environment 500 without respect to a distance between a physical object and the device 100 .
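A sketch of how the device might be placed in the AR environment from either source of information follows; the field names and the detection structure are assumptions.

```python
# Hypothetical sketch: localize the device 100 either from the position/orientation
# it reports over the channel or from a recognized indicator image in the camera view.
def locate_device(reported=None, indicator_detection=None):
    """Return a (position, orientation) estimate for the device, or None."""
    if reported is not None:
        return reported["position"], reported["orientation"]
    if indicator_detection is not None:
        return indicator_detection["position"], indicator_detection["orientation"]
    return None  # device not currently localized in the AR environment
```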
- the processor 210 may be configured to receive input from the device 100 .
- the processor 210 may receive data from the device 100 related to input that was received via the input component 150 .
- the input received via the input component 150 may comprise, for example, a button press, a gesture, and/or other input.
- the processor 210 of the AR device 200 may process the received data and perform functionality based on the processing.
- the processor 210 may add, delete, change, and/or otherwise modify one or more virtual objects 510 a, 510 b, . . . , 510 n in the AR environment 500 .
- the processor 210 may send data to the device 100 based on the processing.
- the processor 210 may perform other functionality based on the processing.
- the processor may receive input from the device 100 that includes identifying indicia for the device 100 and an indication that the input comprises the identifying indicia.
- the AR device 200 may store the identifying indicia and associate the identifying indicia with the device 100 .
- the AR device 200 may be configured in the shape of an eyeglass.
- the AR device 200 may be configured to display the AR environment 500 (or AR environments 500 A, 500 B) via one or both lenses 250 of the eyeglass.
- Components of the AR device 200 (e.g., the imaging device 220 , the wireless transceiver 240 , the processor, etc.) may be carried at various portions of the eyeglass.
- a portion of the frame near one of the lenses 250 A (or a portion of the lens 250 A) may comprise the imaging device 220 .
- a portion of the frame near the other lens 250 B may comprise a wireless transceiver 240 that may comprise a communication port.
- the eyeglass arm 210 (including the portion of the eyeglass frame extending from the lens to the ear) may comprise the processor, the communication port, and/or other components of the AR device 200 .
- the eyeglass arms 210 A, 210 B may comprise one or more of the processor, the communication port, and/or other components of the AR device 200 . Other configurations may be used as well.
- the AR device 200 may be configured as a mobile phone such as a personal digital assistant, smart phone, and/or other mobile phone.
- the imaging device 220 may include one or more cameras (e.g., a front facing camera 220 A, a back facing camera 220 B, and/or other camera of the mobile phone).
- the processor of the mobile phone may comprise the components and functionality of the processor of the AR device 200 .
- the communication components and functionality of the phone (e.g., one or more ports, a wireless transceiver, one or more antennas, processing functionality to facilitate communication with other devices, and/or other communication components and functionality) may comprise the communication port of the AR device 200 .
- the display of the mobile device may be configured to display the AR environment 500 .
- the AR device 200 may be configured as a computing device, such as a laptop, desktop computer, tablet, and/or other computing device.
- One or more imaging devices 220 A, 220 B may include a front facing camera, a back facing camera, a webcam communicably coupled to the computing device, and/or other imaging device.
- the processor of the computing device may comprise the components and functionality of the processor of the AR device 200 .
- the communication components and functionality of the computing device (e.g., one or more ports, a wireless transceiver, one or more antennas, processing functionality to facilitate communication with other devices, and/or other communication components and functionality) may comprise the communication port of the AR device 200 .
- the display of the computing device may be configured to display the AR environment 500 .
- the AR device 200 may comprise a television, video game system, and/or other device for displaying moving images.
- One or more imaging devices 220 may include a front facing camera, an object sensor, a webcam communicably coupled to the computing device, and/or other imaging device.
- the processor of the device may comprise the components and functionality of the processor of the AR device 200 .
- the communication components and functionality of the device (e.g., one or more ports, a wireless transceiver, one or more antennas, processing functionality to facilitate communication with other devices, and/or other communication components and functionality) may comprise the communication port of the AR device 200 .
- the display 250 of the device may be configured to display the AR environment 500 .
- the server 400 may be configured to communicate with one or more of the device 100 , the AR device 200 , the handheld device 102 , and/or other devices in communication with the server 400 .
- server 400 may comprise a processor, a storage, and a communication port.
- the processor of the server 400 may be configured to receive data, recognize objects, handle events, send data, and/or provide other functionality.
- the server 400 may be configured to receive, from the processor 110 of the device 100 , a control signal.
- the storage of the server 400 may comprise a lookup table that may be configured in a manner similar to or the same as the lookup table of the device 100 that comprises a plurality of control signals and a plurality of feedback responses.
- the server 400 may communicate information including a feedback response to the feedback control module 111 .
- the server 400 may communicate an indication that no match was found for the control signal to the feedback control module 111 .
- the server 400 may perform image processing and/or object recognition relating to data received from the device 100 .
- the server 400 may receive data related to an object captured by the imaging device 220 of the AR device 200 .
- the processor of the server 400 may perform object recognition related to the captured object.
- the storage of the server 400 may comprise a lookup table that comprises one or more physical objects.
- the server 400 may determine whether the lookup table comprises an entry related to the object recognized from the received data.
- the server 400 may communicate information relating to a stored object that matches the recognized object to the object recognition module 211 .
- the server 400 may communicate an indication that no match was found to the object recognition module 211 .
- the server 400 may receive data related to a physical object recognized by the object recognition module 211 of the processor 210 of the AR device 200 .
- the processor of the server 400 may determine whether the storage of the server 400 has stored an association between the physical object and one or more virtual objects.
- the storage of the server 400 may comprise a lookup table that comprises physical objects, virtual objects, and one or more correlations between one or more physical object and one or more virtual objects.
- the server 400 may communicate, to the object generation module 212 , data related to the associated virtual objects.
- the server 400 may communicate that no association has been found.
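- The association lookup between recognized physical objects and virtual objects can be sketched in the same way; the identifiers below are invented examples and not part of this disclosure.

```python
# Hypothetical sketch: the server stores correlations between physical objects
# and virtual objects, and returns the associated virtual objects, or None
# when no association has been found.
from typing import Optional

PHYSICAL_TO_VIRTUAL = {
    "marker_cube": ["info_panel", "highlight_outline"],
    "book_cover": ["animated_character"],
}

def associated_virtual_objects(physical_object_id: str) -> Optional[list]:
    return PHYSICAL_TO_VIRTUAL.get(physical_object_id)
```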
- the server 400 may be configured to receive event data from the event handler module 213 of the processor 210 of the AR device 200 .
- the storage of the server 400 may include a lookup table that associates a plurality of events and a respective plurality of control signals.
- when the lookup table includes an entry related to the event, the processor of the server 400 may communicate the control signal associated with the event to the event handler module 213 .
- the processor of the server 400 may communicate that no match was found to the AR device 200 .
- the communication port of the server 400 may include an interface through which a communication channel 300 may be maintained with, for example, the device 100 , the handheld device 102 , the AR device 200 , and/or other device in communication with the server 400 .
- Data and/or signals may be received via the communication channel 300 , and/or other communication channel through which the server 400 receives data and/or signals.
- FIG. 6 illustrates a flowchart of an exemplary process of providing feedback based on an AR environment 500 , according to an implementation of the invention.
- the described operations of FIG. 6 and other Figures may be accomplished using some or all of the system components described in detail above and, in some implementations, various operations may be performed in different sequences. In other implementations, additional operations may be performed along with some or all of the operations shown in FIG. 6 and the other Figures. In yet other implementations, one or more operations may be performed simultaneously. In yet other implementations, one or more combinations of various operations may be performed. Some implementations may not perform all of the operations described with relation to FIG. 6 and other Figures. Accordingly, the operations described are exemplary in nature and, as such, should not be viewed as limiting.
- the operations of FIG. 6 and other Figures may be implemented in one or more processing devices (e.g., device 100 , AR device 200 , server 400 , handheld device 102 , and/or other devices).
- the one or more processing devices may include one or more devices executing some or all of the operations of FIG. 6 and other Figures in response to instructions stored electronically on an electronic storage medium.
- the one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of FIG. 6 and other Figures.
- a communication channel 300 may be established between the device 100 and the AR device 200 .
- an occurrence of an event may be detected in the AR environment 500 .
- One or both of the object recognition module 211 and the object generation module 212 of the processor 210 of the AR device 200 may be configured to facilitate the generation of the AR environment 500 including the AR space 510 coincident with the physical space 520 .
- the event handler module 213 may be configured to determine an event within the AR environment 500 based on information received from one or more of the processor 210 or the imaging device 220 .
- the event may include an interaction between the device 100 and the AR environment 500 , a confirmation of an action occurring with respect to the AR environment 500 , a confirmation that the device 100 is recognized by the AR device 200 , an interaction between the device 100 and one or more virtual objects displayed in the AR space 510 , an interaction between a user and the AR environment, and/or other occurrence in the AR environment 500 .
- a device 100 may receive a control signal from the AR device 200 .
- the control signal generation module 214 may be configured to determine a control signal associated with the detected event and communicate the control signal to the device 100 via the wired or wireless communication channel 300 .
- the feedback control module 111 of the processor 110 may receive the control signal.
- the control signal may be based on the detection of the occurrence of the event in the AR environment 500 .
- feedback may be provided via the device 100 .
- the feedback control module 111 may cause the feedback device 120 to provide feedback via the device 100 .
- the feedback control module 111 may be configured to provide the control signal to the feedback device 120 .
- the control signal may be directly applied to the feedback device 120 to cause the feedback.
- the feedback control module 111 may be configured to determine a feedback response based on the received control signal.
- the feedback control module 111 may provide a feedback response comprising one or more types of feedback and one or more feedback signals of the indicated feedback types to be generated.
- the feedback control module 111 may be configured to generate the indicated feedback signals of the feedback response and transmit the feedback signals to the respective feedback devices 120 to which the signals correspond.
- One or more feedback devices 120 of the device 100 may provide feedback via the device 100 .
- the feedback provided may be based on one or more feedback signals received from the feedback control module 111 .
- the feedback may correspond to the event detected within the AR environment 500 .
- the feedback may be representative of the event.
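- The overall flow of FIG. 6 can be summarized with the short sketch below; the object methods (open_channel, detect_event, control_signal_for, feedback_response_for, render) are hypothetical placeholders rather than interfaces defined by this disclosure.

```python
# Hypothetical sketch of the FIG. 6 flow: establish a communication channel,
# detect an event in the AR environment, send a control signal to the device,
# and render the resulting feedback on the device's feedback devices.
def provide_feedback_for_event(ar_device, device):
    channel = ar_device.open_channel(device)            # wired or wireless channel
    event = ar_device.detect_event()                     # e.g. an interaction in the AR environment
    if event is None:
        return
    control_signal = ar_device.control_signal_for(event)
    channel.send(control_signal)                         # AR device -> device
    response = device.feedback_response_for(control_signal)
    for feedback_device, signal in response:              # haptic, visual, and/or audio
        feedback_device.render(signal)
```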
- FIG. 7 illustrates a flowchart of an exemplary process of providing feedback based on an AR environment, according to an implementation of the invention.
- a communication channel 300 may be established between the device 100 and the AR device 200 .
- the device 100 may be detected within an AR environment 500 of the AR device 200 .
- the imaging device 220 of the AR device 200 may capture an image in the physical space 520 of the AR environment 500 .
- the object recognition module 211 of the processor 210 of the AR device 200 may detect the device 100 based on image data captured by the imaging device 220 .
- the object recognition module 211 may detect an object in the image and may determine whether the object is the device 100 based on data associated with the detected object.
- the object recognition module 211 may detect the device 100 based on an identifying indicia disposed at the device 100 .
- the processor 210 of the AR device may detect the device 100 based on an identifying indicia of the device 100 .
- the identifying indicia may comprise a visible optical signature (e.g., an optical signature that is within visible wavelengths of light) or a non-visible signature (e.g., an optical signature that is not within the visible wavelengths of light).
- an AR symbol may be disposed on a surface of the device 100 .
- the object recognition module 211 may detect the device 100 based on the AR symbol disposed on the device 100 .
- processor 210 of the AR device may detect the device 100 based on an audible signature, an infrared signature, and/or other signature emitted by the device 100 .
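- A compact sketch of this recognition step follows; the detector callables are stand-ins for image and signal processing that this disclosure does not specify.

```python
# Hypothetical sketch: the device is treated as recognized when any configured
# detector (AR symbol, optical signature, audible or infrared signature) fires.
from typing import Callable, Iterable

def recognize_device(sample: dict, detectors: Iterable[Callable[[dict], bool]]) -> bool:
    return any(detector(sample) for detector in detectors)

# Trivial stand-in detectors; real ones would analyze camera, microphone, or
# infrared data captured by the AR device.
detects_ar_symbol = lambda sample: bool(sample.get("ar_symbol"))
detects_ir_signature = lambda sample: bool(sample.get("ir_signature"))

print(recognize_device({"ar_symbol": True}, [detects_ar_symbol, detects_ir_signature]))  # True
```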
- the device 100 may receive, from the AR device 200 , a control signal that indicates recognition of the device 100 within the AR environment 500 .
- data indicating the recognition of the device 100 may be sent to the event handler module 213 .
- the event handler module 213 may compare the data received from the object recognition module 211 with data stored in the storage to determine whether the data is associated with an event.
- the event handler module 213 may transmit event data, including the received information and data relating to the recognition of the device 100 , to the control signal generation module 214 .
- the control signal generation module 214 may be configured to receive the event data from the event handler module 213 and generate a control signal based on the event data for transmission to the device 100 .
- the feedback control module 111 of the processor 110 may receive the control signal generated by the control signal generation module 214 .
- the control signal may indicate the detection of the occurrence of the event in the AR environment 500 .
- feedback indicating recognition of the device 100 within the AR environment 500 may be provided via the device 100 .
- the feedback control module 111 may cause the feedback device 120 to provide feedback via the device 100 .
- the feedback control module 111 may be configured to provide the control signal to the feedback device 120 .
- the control signal may be directly applied to the feedback device 120 to cause the feedback.
- the feedback control module 111 may be configured to determine a feedback response based on the received control signal.
- the feedback control module 111 may be configured to generate indicated feedback signals of the feedback response and transmit the feedback signals to the respective feedback devices 120 to which the signals correspond.
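- The fan-out from a feedback response to the individual feedback devices can be sketched as follows; the device and signal names are illustrative only.

```python
# Hypothetical sketch: each feedback signal in the response is transmitted to
# the feedback device (haptic, visual, audio) to which it corresponds.
class LoggingFeedbackDevice:
    def __init__(self, name: str):
        self.name = name
    def render(self, signal: str) -> None:
        print(f"{self.name} renders {signal}")

def dispatch_feedback(response: dict, devices: dict) -> None:
    """response maps a feedback type to a signal; devices maps a type to a device."""
    for feedback_type, signal in response.items():
        device = devices.get(feedback_type)
        if device is not None:
            device.render(signal)

dispatch_feedback({"haptic": "pulse_pattern", "visual": "blue_glow"},
                  {"haptic": LoggingFeedbackDevice("haptic actuator"),
                   "visual": LoggingFeedbackDevice("light-emitting band")})
```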
- Feedback may be provided via the device 100 .
- one or more feedback devices 120 of the device 100 may provide feedback via the device 100 .
- the feedback provided may be based on one or more feedback signals received from the feedback control module 111 .
- the feedback may correspond to the event detected within the AR environment 500 .
- the feedback may be representative of the recognition of the device 100 within the AR environment 500 .
- an interaction between the device 100 and the AR environment 500 may be detected.
- an interaction may be detected based on input received from the device 100 via the input component 150 .
- An interaction may be detected based on a recognition of the device 100 in the AR environment 500 , a movement of the device 100 in the AR environment 500 , an interaction of the device 100 with a physical object 520 a in the physical space 520 of the AR environment 500 , an interaction of the device 100 with a virtual object 510 a in the virtual space 510 of the AR environment 500 , a feedback provided by the device 100 in the AR environment 500 (e.g., a visual feedback provided in the physical space 520 and/or other feedback recognizable by the AR device 200 ), a recognition of a user in the AR environment 500 , a movement of a user in the AR environment 500 , an interaction of a user with a physical object 520 a in the physical space 520 of the AR environment 500 , an interaction of a user with a virtual object 510 a in the virtual space 510 of the AR environment 500 , and/or another interaction between the device 100 and the AR environment 500 ; a simple classification of these interaction types is sketched below.
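- The following Python fragment is a minimal, hypothetical sketch of such a classification; the observation fields and event labels are invented for illustration and do not appear in this disclosure.

```python
# Hypothetical sketch: map simple observations about the AR environment to the
# interaction types listed above.
def classify_interactions(obs: dict) -> list:
    events = []
    if obs.get("device_recognized"):
        events.append("recognition_of_device")
    if obs.get("device_moved"):
        events.append("movement_of_device")
    if obs.get("near_physical_object"):
        events.append("interaction_with_physical_object")
    if obs.get("near_virtual_object"):
        events.append("interaction_with_virtual_object")
    if obs.get("user_gesture"):
        events.append("user_interaction")
    return events

print(classify_interactions({"device_recognized": True, "near_virtual_object": True}))
```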
- the device 100 may receive, from the AR device 200 , a control signal indicating an interaction between the device 100 and the AR environment 500 .
- the AR device 200 including, for example, object recognition module 211 , event handler module 213 , control signal generation module 214 , and/or other modules or devices of the AR device 200
- the AR device 200 may generate a control signal based on the detection of the interaction for transmission to the device 100 .
- the feedback control module 111 of the processor 110 may receive the control signal generated by the AR device 200 .
- the control signal may be based on the detection of the interaction between the device 100 and the AR environment 500 .
- feedback indicating the interaction may be provided via the device 100 .
- the feedback control module 111 may cause the feedback device 120 to provide feedback via the device 100 .
- the feedback control module 111 may be configured to provide the control signal to the feedback device 120 .
- the control signal may be directly applied to the feedback device 120 to cause the feedback.
- the feedback control module 111 may be configured to determine a feedback response based on the received control signal.
- the feedback control module 111 may be configured to generate indicated feedback signals of the feedback response and transmit the feedback signals to the respective feedback devices 120 to which the signals correspond.
- Feedback may be provided via the device 100 .
- one or more feedback devices 120 of the device 100 may provide feedback via the device 100 .
- the feedback provided may be based on one or more feedback signals received from the feedback control module 111 .
- the feedback may correspond to the interaction that was detected between the device 100 and the AR environment 500 .
- the feedback may be representative of the interaction between the device 100 and the AR environment 500 .
- a position and/or orientation of the device 100 within the AR environment 500 may be detected.
- the position/orientation device 140 of the device 100 may be configured to provide the AR device 200 with a position, an orientation, or both, via the communication channel 300 .
- the position/orientation device 140 may comprise a gyroscope, a geospatial positioning device, a compass, and/or other orienting/positioning devices.
- the processor 210 of the AR device 200 may be configured to determine the position of the device 100 and/or the orientation of the device 100 within the AR environment 500 based on the received position and/or orientation.
- a position indicator image and/or orientation indicator image may be disposed on the device 100 .
- the object recognition module 211 may recognize the position indicator image and/or the orientation indicator image when recognizing that the device 100 is within the view of the imaging device and/or within the physical space 520 of the AR environment 500 .
- the position indicator image and/or the orientation indicator image data may be processed by the object recognition module 211 , the event handler module 213 , and/or other modules of the AR device 200 to determine a position and/or an orientation of the device 100 within the AR environment 500 .
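- A minimal sketch of combining these two sources of position and orientation follows; the Pose structure and the preference for the image-derived pose are illustrative assumptions, not requirements of this disclosure.

```python
# Hypothetical sketch: prefer the pose recovered from the position/orientation
# indicator images when they are visible; otherwise fall back to the position
# and orientation reported by the device over the communication channel.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Pose:
    position: tuple      # (x, y, z) within the AR environment
    orientation: tuple   # e.g. (yaw, pitch, roll)

def device_pose(image_pose: Optional[Pose], reported_pose: Optional[Pose]) -> Optional[Pose]:
    return image_pose if image_pose is not None else reported_pose
```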
- the device 100 may be detected to be within a certain proximity of a virtual object in the AR environment 500 .
- the AR device 200 may determine whether the device 100 is within a certain proximity of a virtual object (e.g., virtual object 510 n ) in the AR environment 500 .
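- One way to express the proximity test is sketched below; the distance threshold is an invented example value.

```python
# Hypothetical sketch: the device is "within a certain proximity" of a virtual
# object when the distance between them in the AR environment is at most a
# threshold.
import math

def within_proximity(device_position, virtual_object_position, threshold=0.2):
    return math.dist(device_position, virtual_object_position) <= threshold

print(within_proximity((0.0, 0.0, 0.0), (0.1, 0.1, 0.0)))  # True for the 0.2 example threshold
```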
- the device 100 may receive, from the AR device 200 , a control signal indicating one or more characteristics of a virtual object of the AR environment 500 .
- the device 100 may receive the control signal indicating one or more characteristics of the virtual object 510 a.
- the device 100 may receive the control signal based on the detection of the device within a certain proximity of a virtual object 510 a.
- the device 100 may receive the control signal separately from a detection of a position of the device 100 respective to the virtual object 510 a.
- feedback coordinating with the virtual object may be provided via the device 100 .
- the feedback control module 111 of the processor 110 may receive the control signal indicating one or more characteristics of the virtual object 510 a.
- the feedback control module 111 may cause the feedback device 120 to provide feedback via the device 100 based on the one or more characteristics of the virtual object 510 a.
- the feedback control module 111 may be configured to determine a feedback response based on the received control signal.
- the feedback control module 111 may be configured to generate indicated feedback signals of the feedback response and transmit the feedback signals to the respective feedback devices 120 to which the signals correspond.
- the feedback signals may cause the respective feedback devices 120 to provide feedback corresponding to the one or more characteristics of the virtual object 510 a.
- One or more feedback devices 120 of the device 100 may provide feedback via the device 100 that corresponds to the one or more characteristics of the virtual object 510 a.
- the feedback devices 120 may provide feedback of a same color, of a corresponding visual, audio and/or haptic pattern, and/or other corresponding feedback.
- the feedback devices 120 may provide feedback via a light-emitting band at a portion of the device 100 .
- the light-emitting band may include a color, a pattern, and/or other visual characteristics that coordinate with the virtual object 510 a.
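- The coordination between virtual-object characteristics and the feedback can be sketched as follows; the characteristic names are invented for illustration and are not defined by this disclosure.

```python
# Hypothetical sketch: derive coordinated feedback (e.g. the light-emitting
# band's color and pattern) from characteristics of the nearby virtual object.
def coordinated_feedback(virtual_object: dict) -> dict:
    return {
        "visual": {"color": virtual_object.get("color", "white"),
                   "pattern": virtual_object.get("pattern", "steady")},
        "haptic": {"pattern": virtual_object.get("haptic_pattern", "single_pulse")},
    }

print(coordinated_feedback({"color": "blue", "pattern": "pulsing"}))
```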
- an interaction between a user and the AR environment 500 may be detected.
- an interaction may be detected based on input received from the AR device 200 via the imaging device 220 , the communication port 230 , and/or other device.
- An interaction may be detected based on a recognition of a user in the AR environment 500 , a movement of a user in the AR environment 500 , an interaction of a user with a physical object 520 a in the physical space 520 of the AR environment 500 , an interaction of a user with a virtual object 510 a in the virtual space 510 of the AR environment 500 , a detection of a user within a certain physical proximity of a physical object 520 a in the physical space 520 of the AR environment 500 , a detection of a user within a certain proximity of a virtual object 510 a in the virtual space 510 of the AR environment 500 , and/or another interaction between the user and the AR environment 500 .
- the AR device 200 may detect only interactions involving users that have information associated with the device 100 and/or the AR device 200 .
- the AR device 200 may detect an interaction between any user and the AR environment 500 .
- the device 100 may receive, from the AR device 200 , a control signal indicating an interaction between the user and the AR environment 500 .
- the AR device 200 including, for example, object recognition module 211 , event handler module 213 , control signal generation module 214 , and/or other modules or devices of the AR device 200
- the AR device 200 may generate a control signal based on the detection of the interaction for transmission to the device 100 .
- the feedback control module 111 of the processor 110 may receive the control signal generated by the AR device 200 .
- the control signal may be based on the detection of the interaction between the user and the AR environment 500 .
- feedback indicating the interaction may be provided via the device 100 .
- the feedback control module 111 may cause the feedback device 120 to provide feedback via the device 100 .
- the feedback control module 111 may be configured to provide the control signal to the feedback device 120 .
- the control signal may be directly applied to the feedback device 120 to cause the feedback.
- the feedback control module 111 may be configured to determine a feedback response based on the received control signal.
- the feedback control module 111 may be configured to generate indicated feedback signals of the feedback response and transmit the feedback signals to the respective feedback devices 120 to which the signals correspond.
- Feedback may be provided via the device 100 .
- one or more feedback devices 120 of the device 100 may provide feedback via the device 100 .
- the feedback provided may be based on one or more feedback signals received from the feedback control module 111 .
- the feedback may correspond to the interaction that was detected between the device 100 and the AR environment 500 .
- the feedback may be representative of the interaction between the device 100 and the AR environment 500 .
Abstract
Description
- The present invention relates to a device that is configured to generate feedback based on an event that occurs in an augmented reality environment.
- Augmented reality devices provide an augmented reality environment in which physical objects in a physical space are concurrently displayed with virtual objects in a virtual space. Various augmented reality devices recognize specific codes (e.g., QR codes) disposed on physical objects and display one or more virtual objects in a view that includes the physical objects augmented with the virtual objects based on the specific codes. Other augmented reality devices can recognize specific, known physical objects using image recognition such as by transmitting images to a server that performs the image recognition.
- Despite advances in augmented reality systems, the ability to interact with an augmented virtual environment is limited. For example, conventional augmented reality devices typically use speech recognition for providing input in relation to the augmented reality environment. Providing useful feedback to the user is also limited. In addition, recognizing objects in the virtual reality environment may be computationally intensive and reduce usability in many instances.
- The disclosure relates to a device that is configured to generate feedback based on an event that occurs in an augmented reality environment, provide input to the augmented reality environment, and be recognized in association with the augmented reality environment. The augmented reality environment may be generated by an augmented reality device communicably coupled to the device.
- A device may be configured to provide feedback based on an augmented reality environment. The device may comprise, for example, a processor configured to receive a control signal from an augmented reality device and a feedback device configured to provide a feedback based on the received control signal. The augmented reality device may generate an augmented reality environment and may be remote from the device. The control signal received by the device may be representative of an event occurring in the augmented reality environment. The augmented reality environment may include a physical space in which at least one physical object exists and an augmented reality space in which one or more virtual objects that augment the physical object are displayed.
- The event may include an interaction between the device and the augmented reality environment, a confirmation of an action occurring with respect to the augmented reality environment, a confirmation that the device is recognized by the augmented reality device, an interaction between the device and one or more virtual objects displayed in the augmented reality space, and/or other occurrence in the augmented reality environment.
- The device may comprise, for example, a communication port, a position or orientation device, an input component, and/or other components. The communication port may include an interface through which a communication channel may be maintained with, for example, the augmented reality device. The control signal from the augmented reality device may be received via the communication channel, which may include a wired or a wireless communication channel. The position or orientation device may be configured to provide the augmented reality device with a position, an orientation, or both, via the communication channel. The input component may be configured to receive an input such as, for example, a button press, a gesture, and/or other input. The input may be communicated, by the processor, to the augmented reality device via the communication channel.
- In some implementations, the processor of the device may be configured to execute one or more modules, including, for example, a feedback control module, a communication module, and/or other computer program modules. The feedback control module may be configured to receive a control signal and cause the feedback device to provide the feedback. The communication module may be configured to facilitate communication between the device and the augmented reality device.
- The feedback control module may be configured to receive a control signal and cause the feedback device to provide the feedback. The control signal may be representative of an event at the augmented reality device. The event may include, for example, one or more virtual objects being displayed in the augmented virtual environment, one or more interactions between the device and the one or more virtual objects, and/or other occurrence related to the augmented reality environment. In some embodiments, the feedback control module may be configured to provide the control signal to the feedback device. In this embodiment, the control signal may be directly applied to the feedback device to cause the feedback. In some embodiments, the feedback control module may be configured to determine a feedback signal based on the received control signal. In these embodiments, the feedback control module may consult a lookup table to determine the feedback signal based on the received control signal.
- The communication module may be configured to facilitate communication between the device and the augmented reality device. In some implementations, the communication module may be configured to facilitate communication between the device, the augmented reality device, the server, a handheld device that comprises similar components and functionality as the device, and/or other devices that may be in communication with the device. The communication module may be configured to provide a wired or wireless communication channel for communication between the device, the augmented reality device, the handheld device, the server, and/or other device.
- In some implementations, the feedback device may comprise a haptic output device configured to provide haptic feedback in the form of a haptic effect, a visual device configured to provide a visual feedback, an audio device configured to provide an audible feedback, and/or other device that produces feedback. The haptic output device may include an actuator, for example, an electromagnetic actuator such as an Eccentric Rotating Mass (“ERM”) in which an eccentric mass is moved by a motor, a Linear Resonant Actuator (“LRA”) in which a mass attached to a spring is driven back and forth, or a “smart material” such as piezoelectric, electro-active polymers or shape memory alloys, a macro-composite fiber actuator, an electro-static actuator, an electro-tactile actuator, and/or other type of actuator that provides a physical feedback such as a haptic (e.g., vibrotactile) feedback. The haptic output device may include non-mechanical or non-vibratory devices such as those that use electrostatic friction (ESF), ultrasonic surface friction (USF), or those that induce acoustic radiation pressure with an ultrasonic haptic transducer, or those that use a haptic substrate and a flexible or deformable surface, or those that provide projected haptic output such as a puff of air using an air jet, and so on. The visual device may be configured to generate a visual feedback such as visible light at the device. For example, the visual feedback may visually indicate the occurrence of an event in the augmented reality environment.
- The feedback device may be configured to receive one or more signals (e.g., the control signal or the feedback signal) from the feedback control module. Based on the one or more signals, the haptic output device, visual device, audio device, and/or other feedback devices may provide feedback via the device.
- The device may comprise an identifier device configured to generate identifying indicia. The identifying indicia may be used by the augmented reality device to identify the device. The identifying indicia may comprise a visible optical signature (e.g., an optical signature that is within visible wavelengths of light) or a non-visible signature (e.g., an optical signature that is not within the visible wavelengths of light). In some implementations, an augmented reality symbol may be disposed on a surface of the device. The augmented reality symbol may be used, for example, to determine an orientation of the device within the augmented reality environment, identify the presence of the device in the augmented reality environment, and/or allow other forms of recognition of the device. In some implementations, the device may emit an audible signature, an infrared signature, and/or other signature that may be recognized by the augmented reality device.
- In some implementations, the device may be configured as a wearable device such as a ring. In these implementations, the feedback device may comprise a light-emitting band disposed about the ring. The feedback from the light-emitting band may include a color, a pattern, and/or other visual characteristics that may coordinate with one or more virtual objects in the augmented reality environment. The feedback device may comprise one or more haptic output devices spaced apart at the ring.
- In some implementations, the device may include a glove, a thimble, ring, and/or other device that can be worn. In these implementations, the feedback device may comprise a light-emitting band disposed at a portion of the device at a fingertip and/or other portion of the device. The feedback device may comprise one or more haptic output devices spaced apart throughout the device. An identifying indicia and/or augmented reality symbol may be disposed on a surface of the device. In some implementations, the device may cover at least a fingertip on a finger of a wearer's hand. An identifying indicia or augmented reality symbol may be disposed on a surface of the device covering the fingertip of the wearer and/or other surface of the device.
- In some implementations, a handheld device may comprise the same or similar components and functionality and may interact in a same or similar manner with the augmented reality device as the device. The handheld device may comprise, for example, a stylus, a joystick, a mobile phone, a video game controller, and/or other handheld device that may be communicably coupled to the augmented reality device. In some implementations, both the device and the handheld device may simultaneously interact with the augmented reality device.
- An augmented reality (“AR”) device may be configured to generate an augmented reality environment comprising both an augmented reality space and a physical space. The AR device may comprise, for example, a communication port, an imaging device, a processor, and/or other components. The communication port may comprise an interface through which a communication channel may be maintained with, for example, the device. An imaging device such as a camera may be configured to image the physical space. In some implementations, the imaging device of the augmented reality device may comprise a camera, an infrared detector, and/or other image recording device. The processor may be configured to generate the augmented reality space coincident with the physical space. The processor may be configured to recognize at least one physical object in the physical space and augment the at least one physical object with one or more virtual objects in the augmented reality space. The processor may be configured to determine an event within the augmented reality environment and communicate a control signal representative of that event to the device via the wired or wireless communication channel. The control signal may cause feedback to be generated at the device.
- In some implementations, the processor of the AR device may be configured to execute one or more modules, including, for example, an object recognition module, an object generation module, an event handler module, a control signal generation module, a communication module, and/or other computer program modules. The object recognition module may be configured to recognize physical objects in the physical space. The object generation module may be configured to generate virtual objects to augment recognized physical objects. The event handler module may be configured to detect whether an event occurs in the augmented reality environment. The control signal generation module may be configured to receive information relating to an event and generate a control signal for transmission to the device. The communication module may be configured to facilitate communication between the augmented reality device and the device.
- In some implementations, a system of providing feedback based on an augmented reality environment may comprise the augmented reality device, the device, and/or the handheld device.
- These and other objects, features, and characteristics of the system and/or method disclosed herein, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the invention. As used in the specification and in the claims, the singular form of “a”, “an”, and “the” include plural referents unless the context clearly dictates otherwise.
- The components of the following figures are illustrated to emphasize the general principles of the present disclosure and are not necessarily drawn to scale. Reference characters designating corresponding components are repeated as necessary throughout the figures for the sake of consistency and clarity.
- FIG. 1 illustrates a block diagram of an exemplary system of providing feedback based on an augmented reality environment, according to an implementation of the invention;
- FIG. 2A illustrates a schematic view of an exemplary device configured as a wearable device such as a ring, according to various implementations of the invention;
- FIG. 2B illustrates a schematic view of an exemplary device configured as a handheld device such as a stylus, according to various implementations of the invention;
- FIG. 3 illustrates a schematic view of an exemplary feedback device, according to an implementation of the invention;
- FIG. 4 illustrates a depiction of an exemplary augmented reality environment, according to implementations of the invention;
- FIGS. 5A, 5B, and 5C illustrate schematic views of exemplary augmented reality devices, according to various implementations of the invention;
- FIG. 6 illustrates a flowchart of an exemplary process of providing feedback based on an augmented reality environment, according to an implementation of the invention; and
- FIG. 7 illustrates a flowchart of an exemplary process of providing feedback based on an augmented reality environment, according to an implementation of the invention.
- FIG. 1 illustrates a block diagram of an exemplary system 10 of providing feedback based on an augmented reality (“AR”) environment. The system 10 may comprise a device 100 , an augmented reality (“AR”) device 200 , a communication channel 300 , a server 400 , a handheld device 102 , and/or other devices that may be in communication with the device 100 , the AR device 200 , or the server 400 . - The
device 100 may be configured to provide feedback based on an AR environment. For example, the device 100 may be configured to generate feedback based on an event that occurs in an AR environment, provide input to the AR environment, and be recognized in association with the AR environment. The AR environment may be generated by the AR device 200 communicably coupled to the device 100 . - The
AR device 200 may generate an AR environment and may be remote from the device 100 . The control signal received by the device 100 may be representative of an event occurring in the AR environment. The AR environment may include a physical space in which at least one physical object exists and an augmented reality (“AR”) space in which one or more virtual objects that augment the physical object are displayed. As discussed in further detail below, in some implementations, the AR device 200 may be configured in the shape of an eyeglass. - The event may include an interaction between the
device 100 and the AR environment, a confirmation of an action occurring with respect to the AR environment, a confirmation that the device 100 is recognized by the AR device 200 , an interaction between the device 100 and one or more virtual objects displayed in the AR space, an interaction between a user and the AR environment, and/or other occurrence in the AR environment. - In some implementations, the
device 100 may comprise, for example, aprocessor 110 configured to receive a control signal from anAR device 200, afeedback device 120 configured to provide a feedback based on the received control signal, acommunication port 130, a position/orientation device 140, aninput component 150, anidentifier device 160, and/or other components. As discussed in further detail below, in some implementations, thedevice 100 may include a glove, a thimble, ring, and/or other device that can be worn. In some implementations, thedevice 100 may be configured as a handheld device, such as a stylus, a joystick, a mobile phone, a video game controller, and/or other handheld device that may be communicably coupled to theAR device 200. In some implementations, thedevice 100 and theAR device 200 may be separate devices in a single physical device or integrated in a single physical device. - In some implementations, the
processor 110 of thedevice 100 may be configured to execute one or more modules, including, for example, afeedback control module 111, acommunication module 112, and/or other computer program modules of thedevice 100. Thefeedback control module 111 may be configured to receive a control signal and cause thefeedback device 120 to provide feedback. Thecommunication module 112 may be configured to facilitate communication between thedevice 100 and theAR device 200. - The
feedback control module 111 may be configured to receive a control signal (e.g., from the AR device 200) and cause thefeedback device 120 to provide the feedback via thedevice 100. The control signal may be representative of an event at theAR device 200. The event may include, for example, one or more virtual objects being displayed in the augmented virtual environment, one or more interactions between thedevice 100 and the one or more virtual objects, and/or other occurrence related to the AR environment. In some embodiments, thefeedback control module 111 may be configured to provide the control signal to thefeedback device 120. In this embodiment, the control signal may be directly applied to thefeedback device 120 to cause the feedback. In some embodiments, thefeedback control module 111 may be configured to determine a feedback response based on the received control signal. In these embodiments, thefeedback control module 111 may consult a lookup table to determine the feedback response based on the received control signal. The feedback response may comprise one or more types of feedback and one or more feedback signals of the indicated feedback types to be generated based on the received control signal. Thefeedback control module 111 may be configured to generate the indicated feedback signals of the feedback response and transmit the feedback signals to therespective feedback devices 120 to which the signals correspond. - In some implementations, the
feedback control module 111 may consult a lookup table of thedevice 100 to determine which types of feedback and which feedback signals to include in the feedback response based on the received control signal. The feedback response may include a single feedback signal, a plurality of feedback signals for asingle feedback device 120, a plurality of feedback signals for a plurality offeedback devices 120, a pattern of feedback signals for one ormore feedback devices 120, and/or other types of feedback response. In some implementations, the type of feedback response may indicate the type of event represented by the control signal. For example, a feedback response comprising a single signal may indicate that the event represents the recognition of thedevice 100 in the AR environment. A feedback response comprising a pattern of signals may indicate that the event represents an interaction between thedevice 100 and the AR environment. The indications associated with the different types of feedback responses are not limited to the described examples. - In some implementations, the lookup table may store associations between a plurality of control signals and a plurality of feedback responses. For example, when a control signal comprises information indicating that an event occurred, the lookup table may store a feedback response associated with that control signal. When a control signal comprises information indicating that a type of event occurred, the lookup table may store one or more different feedback responses for one or more types of event that may be indicated by the information of the control signal. When a control signal comprises information indicating that virtual object(s) were displayed in the augmented virtual environment, the lookup table may store a different feedback response for different virtual objects that may be displayed in the augmented virtual environment. For example, the feedback response may coordinate with one or more of the virtual objects indicated in the signal, such that the feedback response corresponds to one or more characteristics of the one or more virtual objects indicated in the signal. The feedback may comprise a color, a shape, a pattern, a number of feedback signals, and/or a characteristic that is similar to the virtual objects indicated. When a control signal comprises information indicating an interaction between the
device 100 and one or more virtual objects, the lookup table may store a different feedback response for different interactions that may occur between thedevice 100 and the AR environment. In some implementations, thefeedback control module 111 may retrieve a feedback response from aserver 400 that is configured to store a lookup table comprising a plurality of control signals and associated feedback responses. - The
communication module 112 may be configured to facilitate communication between thedevice 100 and theAR device 200. In some implementations, thecommunication module 112 may be configured to facilitate communication between thedevice 100, theAR device 200, theserver 400, thehandheld device 102, which may comprise similar components and functionality as thedevice 100, and/or other devices that may be in communication with thedevice 100. Thecommunication module 112 may be configured to provide a wired orwireless communication channel 300 for communication between thedevice 100, theAR device 200, thehandheld device 102, theserver 400, and/or other device in communication with thedevice 100. - The
feedback device 120 may comprise one or more haptic output devices configured to provide haptic feedback the form of a haptic effect, one or more visual devices configured to provide a visual feedback, one or more audio devices configured to provide an audible feedback, and/or other device that produces feedback. The haptic output device may include an actuator, for example, an electromagnetic actuator such as an Eccentric Rotating Mass (“ERM”) in which an eccentric mass is moved by a motor, a Linear Resonant Actuator (“LRA”) in which a mass attached to a spring is driven back and forth, or a “smart material” such as piezoelectric, electro-active polymers or shape memory alloys, a macro-composite fiber actuator, an electro-static actuator, an electro-tactile actuator, and/or other type of actuator that provides a physical feedback such as a haptic (e.g., vibrotactile) feedback. The haptic output device may include non-mechanical or non-vibratory devices such as those that use electrostatic friction (ESF), ultrasonic surface friction (USF), or those that induce acoustic radiation pressure with an ultrasonic haptic transducer, or those that use a haptic substrate and a flexible or deformable surface, or those that provide projected haptic output such as a puff of air using an air jet, and so on. The visual device may be configured to generate a visual feedback such as visible light at thedevice 100. For example, the visual feedback may visually indicate the occurrence of an event in the AR environment. - The
feedback device 120 may be configured to receive one or more signals (e.g., the control signal or the feedback signal) from the feedback control module 111 . Based on the one or more signals, a haptic output device, visual device, audio device, and/or other feedback devices 120 may provide feedback via the device 100 . - The
communication port 130 may include an interface through which a communication channel 300 may be maintained with, for example, the AR device 200 . The control signal from the AR device 200 may be received via the communication channel 300 , which may include a wired or a wireless communication channel. - The position/
orientation device 140 may be configured to provide the AR device 200 with a position, an orientation, or both, via the communication channel 300 . For example, the position/orientation device 140 may comprise a gyroscope, a geospatial positioning device, a compass, and/or other orienting or positioning devices. - The
input component 150 may be configured to receive an input such as, for example, a button press, a gesture, and/or other input. The input may be communicated, by the processor 110 , to the AR device 200 via the communication channel 300 . For example, the input component 150 may include a touch pad, a touch screen, a mechanical button, a switch, and/or other input component that can receive an input. - The
identifier device 160 may be configured to generate identifying indicia for the device 100 . The identifying indicia may be used by the AR device 200 to identify the device 100 . The identifying indicia may comprise a visible optical signature (e.g., an optical signature that is within visible wavelengths of light) or a non-visible signature (e.g., an optical signature that is not within the visible wavelengths of light). In some implementations, the feedback device 120 may generate the identifying indicia such as by generating an optical signature. - In some implementations, an augmented reality (“AR”) symbol may be disposed on a surface of the
device 100 . The AR symbol may be used, for example, to determine an orientation of the device 100 within the AR environment, identify the presence of the device 100 in the AR environment, and/or allow other forms of recognition of the device 100 . In some implementations, the device 100 may emit an audible signature, an infrared signature, and/or other signature that may be recognized by the AR device 200 . - In some implementations, and as shown in
FIG. 2A , thedevice 100 may be configured as a wearable device such as aring 200. In some implementations, thedevice 100 may include a wearable device such as a glove, a thimble, and/orother device 100 that can be worn. Thefeedback device 120 may comprise one or more devices. The one or more devices may be disposed at one or more portions of thedevice 100. Theidentifier device 160 may comprise an identifying indicia and/or AR symbol that may be disposed on a surface of thedevice 100 covering the fingertip of the wearer and/or other surface of thedevice 100. In some implementations, theidentifier device 160 may generate the identifying indicia and/or AR symbol. In some implementations, thedevice 100 may cover at least a fingernail on a finger of a wearer's hand. An identifying indicia or AR symbol may be disposed on a surface of thedevice 100 covering the fingernail of the wearer and/or other surface of thedevice 100. - As shown in
FIG. 2B , in some implementations, thedevice 100 may be configured as thehandheld device 102.Handheld device 102 may comprise the same or similar components and functionality and may interact in a same or similar manner with theAR device 200 as thedevice 100. Thehandheld device 102 may comprise, for example, a stylus, a joystick, a mobile phone, a video game controller, and/or otherhandheld device 102 that may be communicably coupled to theAR device 200. In some implementations, both thedevice 100 and thehandheld device 102 may simultaneously interact with theAR device 200. - As shown in
FIG. 3 , thefeedback device 120 of thedevice 100 may comprise one or more devices. In some implementations, the one or more devices may be spaced apart at thedevice 100. Thefeedback device 120 may comprise, for example, one or morehaptic output devices 122 configured to provide one or more haptic effects, one or morevisual devices 124 configured to provide a visual feedback, one or moreaudio devices 126 configured to provide an audible feedback, a light-emittingband 128, and/or other device that produces feedback. - The
haptic output device 122 may include an actuator, for example, an electromagnetic actuator such as an Eccentric Rotating Mass (“ERM”) in which an eccentric mass is moved by a motor, a Linear Resonant Actuator (“LRA”) in which a mass attached to a spring is driven back and forth, or a “smart material” such as piezoelectric, electro-active polymers or shape memory alloys, a macro-composite fiber, an electro-static actuator, an electro-tactile actuator, and/or other type of actuator that provides a physical feedback such as a haptic (e.g., vibrotactile) feedback. The haptic output device may include non-mechanical or non-vibratory devices such as those that use electrostatic friction (ESF), ultrasonic surface friction (USF), or those that induce acoustic radiation pressure with an ultrasonic haptic transducer, or those that use a haptic substrate and a flexible or deformable surface, or those that provide projected haptic output such as a puff of air using an air jet, and so on. In some implementations, one or morehaptic output devices 122 may be spaced apart throughout thedevice 100. - The
visual device 124 may be configured to generate a visual feedback such as visible light at the device 100 . For example, the visual feedback may visually indicate the occurrence of an event in the AR environment. - The
audio device 126 may be configured to generate audio feedback such as one or more sounds at the device 100 . For example, the audio feedback may audibly indicate the occurrence of an event in the AR environment. - The light-emitting
band 128 may be configured to generate a light-emitting band emanating from and/or around the device 100 . The light emitted via the band 128 may include a color, a pattern, and/or other visual characteristics. The visual characteristics may coordinate with one or more virtual objects in the AR environment. - Returning back to
FIG. 1 , an AR device 200 may be configured to generate an AR environment comprising both an AR space and a physical space. The AR device 200 may comprise, for example, a processor 210 , an imaging device 220 , a communication port 230 , and/or other components. The processor 210 may be configured to generate the AR space coincident with the physical space. The processor 210 may be configured to recognize at least one physical object in the physical space and augment the at least one physical object with one or more virtual objects in the AR space. The processor 210 may be configured to determine an event within the AR environment and communicate a control signal representative of that event to the device 100 via the wired or wireless communication channel 300 . The control signal may cause feedback to be generated at the device 100 . The imaging device 220 may be configured to image the physical space. In some implementations, the imaging device 220 may comprise one or more cameras, an infrared detector, a video camera, and/or other image recording device. The communication port 230 may comprise an interface through which a communication channel 300 may be maintained with, for example, the device 100 . - In some implementations, the
processor 210 may be configured to execute one or more modules, including, for example, an object recognition module 211 , an object generation module 212 , an event handler module 213 , a control signal generation module 214 , a communication module 215 , and/or other computer program modules. The object recognition module 211 may be configured to recognize physical objects in the physical space. The object generation module 212 may be configured to generate virtual objects to augment recognized physical objects. The event handler module 213 may be configured to detect whether an event occurs in the AR environment. The control signal generation module 214 may be configured to receive information relating to an event and generate a control signal for transmission to the device 100 . The communication module 215 may be configured to facilitate communication between the AR device 200 and the device 100 . - In some implementations, the
object recognition module 211 may be configured to recognize objects in a physical space. The object recognition module 211 may communicate with the imaging device 220 and a storage of the AR device 200 to recognize an object in the physical space. For example, the object recognition module 211 may receive visual data captured from the imaging device 220 and may process the visual data to determine whether one or more objects exist in the captured visual data. The object recognition module 211 may compare the captured objects that exist in the visual data with objects stored in the storage. - For example, the
object recognition module 211 may compare the pixels of a captured object with the pixels of a stored object in the storage according to known techniques. When a threshold percentage of pixels (e.g., 80%, 90%, 100%, and/or other percentages) of the captured object match the pixels of a stored object, the object recognition module 211 may determine that the captured object has been recognized as the stored object. In some implementations, the threshold percentage may depend upon a resolution of the imaging device 220 .
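- The threshold comparison can be sketched as follows; comparing raw pixel values is a simplification of the known techniques referenced above, and the example threshold is illustrative.

```python
# Hypothetical sketch: a captured object is treated as recognized when the
# fraction of matching pixels meets the threshold percentage.
def matches_stored_object(captured_pixels, stored_pixels, threshold=0.9):
    if not captured_pixels or len(captured_pixels) != len(stored_pixels):
        return False
    matching = sum(1 for a, b in zip(captured_pixels, stored_pixels) if a == b)
    return matching / len(stored_pixels) >= threshold

print(matches_stored_object([0, 1, 1, 0], [0, 1, 1, 1], threshold=0.75))  # True (3 of 4 match)
```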
- The object recognition module 211 may obtain information relating to the stored object and transmit the information relating to the stored object and the information relating to the captured object to the object generation module 212 . The information transmitted to the object generation module 212 may include, for example, image data for the stored object, a type of the stored object, the location of the captured object in the physical space, a proximity of the captured object to other physical objects, context information relating to the stored object, context information relating to the captured object, and/or other data associated with the stored object or the captured object. In some implementations, the object recognition module 211 may transmit the information relating to the stored object and the information relating to the captured object to one or more of the event handler module 213 , the control signal generation module 214 , and/or other modules of the processor 210 . - In some implementations, when the captured object does not match a stored object, the
object recognition module 211 may transmit data relating to the captured object to theserver 400 such that theserver 400 can perform object recognition. When theserver 400 recognizes the captured object, theserver 400 may communicate information relating to a stored object that matches the captured object to theobject recognition module 211. The object may transmit the information relating to the stored object from theserver 400 and the information relating to the captured object to theobject generation module 212. When theserver 400 does not recognize the captured object, theserver 400 may communicate an indication that no match was found. - In some implementations, the
object generation module 212 may receive information relating to a physical object from theobject recognition module 211 and may generate one or more virtual objects to augment the physical object in the AR environment. Theobject generation module 212 may access the storage to determine whether one or more virtual objects are associated with the physical object. When no virtual objects are associated with the physical object, theobject generation module 212 may communicate with theserver 400 to determine whether a storage of theserver 400 has stored one or more associations between the one or more physical objects and one or more virtual objects. When an association is found in the storage of theserver 400, theserver 400 may communicate, to theobject generation module 212, data related to the associated virtual objects. - When a virtual object is associated with a physical object identified in the information received from the
- When a virtual object is associated with a physical object identified in the information received from the object recognition module 211, the object generation module 212 may generate an AR space coincident with the physical space. FIG. 4 illustrates a block diagram of an exemplary AR environment 500. The AR environment 500 comprises a physical space 520 comprising one or more physical objects 520a, 520b, . . . , 520n and an AR space 510 comprising one or more virtual objects 510a, 510b, . . . , 510n that augment one or more of the physical objects in the physical space 520.
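One way to picture the relationship in FIG. 4 is as two object lists, with each virtual object recording which physical objects it augments. This is only an illustrative data model, assumed for explanation, not the claimed structure:

```python
# Illustrative data model for the AR environment 500; field names are assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class PhysicalObject:
    ref: str              # e.g. "520a"
    location: tuple       # position in the physical space 520

@dataclass
class VirtualObject:
    ref: str              # e.g. "510a"
    augments: List[str]   # refs of the physical objects it overlays

@dataclass
class AREnvironment:
    physical_space: List[PhysicalObject] = field(default_factory=list)
    ar_space: List[VirtualObject] = field(default_factory=list)

    def virtual_for(self, physical_ref: str) -> List[VirtualObject]:
        """Virtual objects displayed coincident with a given physical object."""
        return [v for v in self.ar_space if physical_ref in v.augments]
```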
- In some implementations, the object generation module 212 may augment a physical object 520n with one or more virtual objects 510a, 510b, . . . , 510n in the AR space 510. For example, the object generation module 212 may display the AR space 510 (and the one or more virtual objects 510a, 510b, . . . , 510n) via a display of the AR device 200. In some implementations, the AR space 510 and the one or more virtual objects 510a, 510b, . . . , 510n may be displayed via the display of the AR device 200.
- The AR environment 500 displayed via the display of the AR device 200 may include the physical space 520 and an AR space 510. In some embodiments, the physical space 520 may be imaged by the imaging device 220 and displayed via the display. In some embodiments, the physical space 520 may simply be viewed through the display, such as in embodiments where the display is configured as an at least partially transparent display (e.g., a lens) through which the physical space 520 may be viewed. Whichever embodiment is used to display the physical space 520, one or more virtual objects 510a, 510b, . . . , 510n may be displayed coincident with one or more physical objects 520a, 520b, . . . , 520n in the physical space 520, thereby augmenting the one or more physical objects in the AR environment 500. A single virtual object 510a may augment a single physical object 520a or a plurality of physical objects 520a, 520b, . . . , 520n, and a plurality of virtual objects 510a, 510b, . . . , 510n may augment a single physical object 520a or a plurality of physical objects 520a, 520b, . . . , 520n. The number of virtual objects that may augment physical objects in the physical space 520 is not limited to the examples described.
- In some implementations, the event handler module 213 may be configured to detect whether an event occurs in the AR environment. The event handler module 213 may receive data from the imaging device 220, the object recognition module 211, the object generation module 212, the storage, and/or other modules or devices of the AR device 200. The storage of the AR device 200 may store data related to one or more events which the AR device 200 may recognize. For example, the storage of the AR device 200 may store data related to events including an interaction between the device 100 and the AR environment, a confirmation of an action occurring with respect to the AR environment, a confirmation that the device 100 is recognized by the AR device 200, an interaction between the device 100 and one or more virtual objects displayed in the AR space 510, a generation of a specific type of virtual object to augment a physical object, a recognition of the device 100, a recognition of the handheld device 102, an interaction between a user and the AR environment, and/or other occurrences related to the AR environment.
- In some implementations, the event handler module 213 may receive visual data from the imaging device 220, information relating to captured objects in the visual data from the object recognition module 211, information relating to virtual objects generated by the object generation module 212, and/or other information related to the AR environment. The event handler module 213 may compare the received information to data related to events stored in the storage to determine whether the information (or a portion of the information) is associated with an event. When the received information is associated with an event, the event handler module 213 may transmit event data including the received information and data relating to the associated event to the control signal generation module 214.
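The comparison step can be sketched as a simple predicate match over stored event definitions. The event names, the dictionary shape of the received information, and the predicate form below are illustrative assumptions:

```python
# Illustrative event matching; the stored event definitions are hypothetical.
STORED_EVENTS = {
    "DEVICE_RECOGNIZED":   lambda info: info.get("device_in_view", False),
    "VIRTUAL_INTERACTION": lambda info: info.get("touched_virtual_object") is not None,
    "USER_INTERACTION":    lambda info: info.get("user_gesture") is not None,
}

def detect_event(received_info: dict):
    """Return (event_name, event_data) when the received info matches a stored event, else None."""
    for name, predicate in STORED_EVENTS.items():
        if predicate(received_info):
            return name, {"event": name, **received_info}
    return None  # in some implementations, forwarded to the server 400 for event handling
```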
- In some implementations, the event handler module 213 may receive data from the processor 210 indicating that an interaction occurred between the device 100 and the AR environment, that one or more virtual objects in the AR environment changed, that input was received from the device 100, that input received from the device 100 was processed by the AR device 200, that an interaction occurred between a user and the AR environment, and/or that other processing was performed by the AR device 200. In some implementations, the event handler module 213 may compare the data received from the processor 210 with data stored in the storage to determine whether the data is associated with an event. When some or all of the received information is associated with an event stored in the storage, the event handler module 213 may transmit event data including the received information and data relating to the associated event to the control signal generation module 214.
- In some implementations, when the received information is not associated with an event stored in the storage, the event handler module 213 may transmit event data including the received information to the server 400 such that the server 400 can perform event handling. When some or all of the received information is associated with an event stored in the storage of the server 400, the server 400 may communicate information relating to the associated event to the event handler module 213. The event handler module 213 may transmit event data including the received information and data relating to the associated event to the control signal generation module 214. When the received information is not associated with an event stored in the storage of the server 400, the server 400 may communicate an indication that no match was found.
- In some implementations, the control signal generation module 214 may be configured to receive the event data from the event handler module 213 and generate a control signal based on the event data for transmission to the device 100. The storage of the AR device 200 may include a lookup table that associates a plurality of events and a respective plurality of control signals. Based on the event data received from the event handler module 213, the control signal generation module 214 may generate a control signal for transmission to the device 100. For example, the control signal generation module 214 may compare the received event data to the data stored at the storage. When some or all of the event data matches an event stored in the storage, the control signal generation module 214 may generate a control signal related to the control signal associated with the matched event. When the event data does not match an event stored in the storage, the control signal generation module 214 may communicate the event data to the server 400 to determine whether a storage of the server 400 has stored a control signal associated with some or all of the event data. The control signal may comprise, for example, information indicating that an event occurred, information indicating that a specific type of event occurred, information indicating that one or more virtual objects have been or are displayed in the augmented virtual environment, information indicating one or more interactions between the device 100 and the one or more virtual objects, and/or other information relating to the event in the AR environment.
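The lookup-table association between events and control signals reads naturally as a dictionary keyed by event, with a server fallback when no entry exists. The signal payloads and the `lookup_control_signal` server call below are invented placeholders, not interfaces defined by the patent:

```python
# Hypothetical event-to-control-signal lookup table; payloads are placeholders.
EVENT_TO_CONTROL_SIGNAL = {
    "DEVICE_RECOGNIZED":   {"type": "confirmation", "pattern": "short_pulse"},
    "VIRTUAL_INTERACTION": {"type": "interaction",  "pattern": "double_pulse"},
    "USER_INTERACTION":    {"type": "interaction",  "pattern": "long_pulse"},
}

def generate_control_signal(event_data: dict, server=None):
    """Map matched event data to a control signal, deferring to the server when unknown."""
    signal = EVENT_TO_CONTROL_SIGNAL.get(event_data.get("event"))
    if signal is None and server is not None:
        signal = server.lookup_control_signal(event_data)  # assumed server-side lookup
    return signal  # transmitted to the device 100 over the communication channel 300
```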
- The communication module 215 may be configured to facilitate communication between the AR device 200 and the device 100. In some implementations, the communication module 215 may be configured to facilitate communication between the AR device 200, the device 100, the server 400, the handheld device 102, and/or other devices that may be in communication with the AR device 200. The communication module 215 may be configured to provide a wired or wireless communication channel 300 for communication between the AR device 200, the device 100, and/or the handheld device 102. The communication module 215 may be configured to provide communication between the AR device 200, the device 100, the handheld device 102, the server 400, and/or other devices via the wired or wireless communication channel 300 or via a separate communication channel. The communication module 215 may be configured to communicate the control signal generated by the control signal generation module 214 to the device 100 and/or the handheld device 102 via the wired or wireless communication channel 300.
- In some implementations, the processor 210 of the AR device 200 may be configured to recognize the device 100 when the device 100 is moved within a field of view of the imaging device 220 and/or within the physical space 520 of the AR environment 500. For example, the object recognition module 211 of the AR device 200 may be configured to recognize the device 100 by comparing image data from the imaging device 220 with image data stored in the storage. The storage of the AR device 200 may include image data corresponding to the device 100. The storage may include image data corresponding to one or more indicia that may be disposed on the device 100. The indicia may comprise a product code, a QR code, an image associated with the device 100, and/or another image used to identify the device 100. The processor 210 of the AR device 200 may be configured to recognize an audible signature, an infrared signature, and/or other signature generated by the device 100. In these implementations, the control signal generation module 214 may generate a control signal representative of the recognition of the device 100 such that the feedback generated at the device 100 indicates the recognition.
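Recognition of the device 100 by indicia (a product code, QR code, or similar) could be sketched as a lookup over whatever codes are decoded from a camera frame. The decoding helper and the indicia table are placeholders, not a specific decoder library or the patent's implementation:

```python
# Sketch only: decode_indicia() stands in for whatever code decoder is used.
KNOWN_DEVICE_INDICIA = {"QR:device-100": "device_100"}  # hypothetical indicia registry

def recognize_device(frame, decode_indicia):
    """Return a recognition event when a known indicia appears in the captured frame."""
    for code in decode_indicia(frame):             # all codes visible in the image
        device = KNOWN_DEVICE_INDICIA.get(code)
        if device is not None:
            return {"event": "DEVICE_RECOGNIZED", "device": device}
    return None
```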
- In some implementations, the processor 210 may be configured to receive a position of the device 100 and/or an orientation of the device 100 from the device 100. The position and/or orientation of the device 100 may be communicated via the communication channel 300 between the device 100 and the AR device 200. The processor 210 may be configured to determine the position of the device 100 and/or the orientation of the device 100 within the AR environment 500 based on the received position and/or orientation. In some implementations, a position indicator image and/or an orientation indicator image may be disposed on the device 100. The object recognition module 211 may recognize the position indicator image and/or the orientation indicator image when recognizing that the wearable object 100 is within the view of the imaging device and/or within the physical space 520 of the AR environment 500. The position indicator image and/or the orientation indicator image data may be processed by the object recognition module 211, the event handler module 213, and/or other modules of the AR device 200 to determine a position and/or an orientation of the device 100 within the AR environment 500. In some implementations, the processor 210 may be configured to position the device 100 within the AR environment 500 without regard to a distance between a physical object and the device 100.
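Combining a pose reported over the communication channel 300 with a pose estimated from indicator images could look like the rough sketch below. The preference for the reported pose and the field names are assumptions made for illustration:

```python
# Hypothetical pose resolution for the device 100; field names are invented.
def resolve_device_pose(reported=None, indicator_estimate=None):
    """Prefer the pose reported by the device 100; fall back to the image-based estimate."""
    if reported is not None:
        return {"position": reported["position"], "orientation": reported["orientation"]}
    if indicator_estimate is not None:
        return indicator_estimate   # derived from position/orientation indicator images
    return None                     # pose unknown this frame
```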
- The processor 210 may be configured to receive input from the device 100. For example, the processor 210 may receive data from the device 100 related to input that was received via the input component 150. The input received via the input component 150 may comprise, for example, a button press, a gesture, and/or other input. The processor 210 of the AR device 200 may process the received data and perform functionality based on the processing. For example, the processor 210 may add, delete, change, and/or otherwise modify one or more virtual objects 510a, 510b, . . . , 510n in the AR environment 500. The processor 210 may send data to the device 100 based on the processing. The processor 210 may perform other functionality based on the processing. In some implementations, the processor 210 may receive input from the device 100 that includes identifying indicia for the device 100 and an indication that the input comprises the identifying indicia. The AR device 200 may store the identifying indicia and associate the identifying indicia with the device 100.
- As shown in FIG. 5A, in some implementations, the AR device 200 may be configured in the shape of an eyeglass. For example, the AR device 200 may be configured to display the AR environment 500 (or separate AR environments) via one or both lenses 250 of the eyeglass. Components of the AR device 200 (e.g., the imaging device 220, the wireless transceiver 240, the processor, etc.) may be disposed at various locations of the eyeglass. The following are merely illustrative, non-limiting examples. A portion of the frame near one of the lenses 250A (or a portion of the lens 250A) may comprise the imaging device 220. A portion of the frame near the other lens 250B (or a portion of the other lens 250B) may comprise a wireless transceiver 240 that may comprise a communication port. The eyeglass arm 210 (including the portion of the eyeglass frame extending from the lens to the ear) may comprise the processor, the communication port, and/or other components of the AR device 200. In some implementations, the eyeglass arms may each comprise one or more components of the AR device 200. Other configurations may be used as well.
- As shown in FIG. 5B, in some implementations, the AR device 200 may be configured as a mobile phone such as a personal digital assistant, smart phone, and/or other mobile phone. The imaging device 220 may include one or more cameras (e.g., a front facing camera 220A, a back facing camera 220B, and/or other camera of the mobile phone). The processor of the mobile phone may comprise the components and functionality of the processor of the AR device 200. The communication components and functionality of the phone (e.g., one or more ports, a wireless transceiver, one or more antennas, processing functionality to facilitate communication with other devices, and/or other communication components and functionality) may comprise the communication port and communication module of the AR device 200. The display of the mobile device may be configured to display the AR environment 500.
- In some implementations, the AR device 200 may be configured as a computing device, such as a laptop, desktop computer, tablet, and/or other computing device. One or more imaging devices 220 (e.g., a front facing camera, a webcam communicably coupled to the computing device, and/or other imaging device) may comprise the imaging device 220 of the AR device 200, and the processor of the computing device may comprise the components and functionality of the processor of the AR device 200. The communication components and functionality of the computing device (e.g., one or more ports, a wireless transceiver, one or more antennas, processing functionality to facilitate communication with other devices, and/or other communication components and functionality) may comprise the communication port and communication module of the AR device 200. The display of the computing device may be configured to display the AR environment 500.
- As shown in FIG. 5C, in some implementations, the AR device 200 may comprise a television, video game system, and/or other device for displaying moving images. One or more imaging devices 220 may include a front facing camera, an object sensor, a webcam communicably coupled to the device, and/or other imaging device. The processor of the device may comprise the components and functionality of the processor of the AR device 200. The communication components and functionality of the device (e.g., one or more ports, a wireless transceiver, one or more antennas, processing functionality to facilitate communication with other devices, and/or other communication components and functionality) may comprise the communication port and communication module of the AR device 200. The display 250 of the device may be configured to display the AR environment 500.
- Referring back to FIG. 1, in some implementations, the server 400 may be configured to communicate with one or more of the device 100, the AR device 200, the handheld device 102, and/or other devices in communication with the server 400. In some implementations, the server 400 may comprise a processor, a storage, and a communication port.
- The processor of the server 400 may be configured to receive data, recognize objects, handle events, send data, and/or provide other functionality. In some implementations, the server 400 may be configured to receive, from the processor 110 of the device 100, a control signal. The storage of the server 400 may comprise a lookup table, configured in a manner similar to or the same as the lookup table of the device 100, that comprises a plurality of control signals and a plurality of feedback responses. When the lookup table includes an entry relating to the control signal, the server 400 may communicate information including a feedback response to the feedback control module 111. When the lookup table of the server 400 does not include an entry relating to the control signal, the server 400 may communicate to the feedback control module 111 an indication that no match was found for the control signal. In some implementations, the server 400 may perform image processing and/or object recognition relating to data received from the device 100.
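The control-signal-to-feedback-response lookup shared by the device 100 and the server 400 can be sketched as the same table structure in both places. The entries and the keying on a signal "type" are invented for illustration only:

```python
# Illustrative lookup table; actual feedback responses depend on the feedback devices 120.
CONTROL_TO_FEEDBACK = {
    "confirmation": [("haptic", "single_tick"), ("led", "green_flash")],
    "interaction":  [("haptic", "ramp_up"),     ("audio", "chirp")],
}

def lookup_feedback_response(control_signal: dict):
    """Return the list of (feedback type, feedback signal) pairs, or None if no entry exists."""
    return CONTROL_TO_FEEDBACK.get(control_signal.get("type"))
```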
- In some implementations, the server 400 may receive data related to an object captured by the imaging device 220 of the AR device 200. The processor of the server 400 may perform object recognition related to the captured object. The storage of the server 400 may comprise a lookup table that comprises one or more physical objects. The server 400 may determine whether the lookup table comprises an entry related to the object recognized from the received data. When the lookup table comprises an entry related to the recognized object, the server 400 may communicate information relating to a stored object that matches the recognized object to the object recognition module 211. When the server 400 does not recognize the object, the server 400 may communicate to the object recognition module 211 an indication that no match was found.
- In some implementations, the server 400 may receive data related to a physical object recognized by the object recognition module 211 of the processor 210 of the AR device 200. The processor of the server 400 may determine whether the storage of the server 400 has stored an association between the physical object and one or more virtual objects. In some implementations, the storage of the server 400 may comprise a lookup table that comprises physical objects, virtual objects, and one or more correlations between one or more physical objects and one or more virtual objects. When an association is found in the storage of the server 400, the server 400 may communicate, to the object generation module 212, data related to the associated virtual objects. When no association is found in the storage of the server 400, the server 400 may communicate that no association has been found.
- In some implementations, the server 400 may be configured to receive event data from the event handler module 213 of the processor 210 of the AR device 200. The storage of the server 400 may include a lookup table that associates a plurality of events and a respective plurality of control signals. When some or all of the event data matches an event stored in the storage of the server 400, the processor of the server 400 may communicate data related to the associated event to the event handler module 213. When the event data does not match an event stored in the storage, the processor of the server 400 may communicate to the AR device 200 that no match was found.
- The communication port of the server 400 may include an interface through which a communication channel 300 may be maintained with, for example, the device 100, the handheld device 102, the AR device 200, and/or other device in communication with the server 400. Data and/or signals may be received via the communication channel 300 and/or other communication channel through which the server 400 receives data and/or signals.
- FIG. 6 illustrates a flowchart of an exemplary process of providing feedback based on an AR environment 500, according to an implementation of the invention. The described operations of FIG. 6 and other Figures may be accomplished using some or all of the system components described in detail above and, in some implementations, various operations may be performed in different sequences. In other implementations, additional operations may be performed along with some or all of the operations shown in FIG. 6 and the other Figures. In yet other implementations, one or more operations may be performed simultaneously. In yet other implementations, one or more combinations of various operations may be performed. Some implementations may not perform all of the operations described with relation to FIG. 6 and other Figures. Accordingly, the operations described are exemplary in nature and, as such, should not be viewed as limiting.
- In some embodiments, the operations of FIG. 6 and other Figures may be implemented in one or more processing devices (e.g., device 100, AR device 200, server 400, handheld device 102, and/or other devices). The one or more processing devices may include one or more devices executing some or all of the operations of FIG. 6 and other Figures in response to instructions stored electronically on an electronic storage medium. The one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of FIG. 6 and other Figures.
- In an operation 602, a communication channel 300 may be established between the device 100 and the AR device 200.
- In an operation 604, an occurrence of an event may be detected in the AR environment 500. One or both of the object recognition module 211 and the object generation module 212 of the processor 210 of the AR device 200 may be configured to facilitate the generation of the AR environment 500 including the AR space 510 coincident with the physical space 520. The event handler module 213 may be configured to determine an event within the AR environment 500 based on information received from one or more of the processor 210 or the imaging device 220. The event may include an interaction between the device 100 and the AR environment 500, a confirmation of an action occurring with respect to the AR environment 500, a confirmation that the device 100 is recognized by the AR device 200, an interaction between the device 100 and one or more virtual objects displayed in the AR space 510, an interaction between a user and the AR environment, and/or other occurrence in the AR environment 500.
- In an operation 606, the device 100 may receive a control signal from the AR device 200. When the event handler module 213 of the AR device 200 detects an event, the control signal generation module 214 may be configured to determine a control signal associated with the detected event and communicate the control signal to the device 100 via the wired or wireless communication channel 300. The feedback control module 111 of the processor 110 may receive the control signal. The control signal may be based on the detection of the occurrence of the event in the AR environment 500.
- In an operation 608, feedback may be provided via the device 100. The feedback control module 111 may cause the feedback device 120 to provide feedback via the device 100. In some implementations, the feedback control module 111 may be configured to provide the control signal to the feedback device 120. In these embodiments, the control signal may be directly applied to the feedback device 120 to cause the feedback. In some implementations, the feedback control module 111 may be configured to determine a feedback response based on the received control signal. The feedback control module 111 may provide a feedback response comprising one or more types of feedback and one or more feedback signals of the indicated feedback types to be generated. The feedback control module 111 may be configured to generate the indicated feedback signals of the feedback response and transmit the feedback signals to the respective feedback devices 120 to which the signals correspond.
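Operation 608 can be summarized as a small dispatch step: either apply the control signal directly or expand it into typed feedback signals for the individual feedback devices 120. In this sketch, the device objects, their `drive()` method, and the `lookup_response` callable are assumptions for illustration, not interfaces defined by the patent:

```python
# Sketch of the dispatch performed by the feedback control module 111; names are assumptions.
def provide_feedback(control_signal, feedback_devices, lookup_response, direct=False):
    """Drive the feedback devices 120 from a received control signal."""
    if direct:
        for device in feedback_devices.values():
            device.drive(control_signal)             # control signal applied directly
        return
    response = lookup_response(control_signal) or []  # e.g. [("haptic", "single_tick")]
    for feedback_type, signal in response:
        device = feedback_devices.get(feedback_type)
        if device is not None:
            device.drive(signal)                     # one feedback signal per matching device
```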
- One or more feedback devices 120 of the device 100 may provide feedback via the device 100. In some implementations, the feedback provided may be based on one or more feedback signals received from the feedback control module 111. The feedback may correspond to the event detected within the AR environment 500. For example, the feedback may be representative of the event.
- FIG. 7 illustrates a flowchart of an exemplary process of providing feedback based on an AR environment, according to an implementation of the invention.
- In an operation 702, a communication channel 300 may be established between the device 100 and the AR device 200.
- In an operation 704, the device 100 may be detected within an AR environment 500 of the AR device 200. For example, the imaging device 220 of the AR device 200 may capture an image in the physical space 520 of the AR environment 500. The object recognition module 211 of the processor 210 of the AR device 200 may detect the device 100 based on image data captured by the imaging device 220. In some implementations, the object recognition module 211 may detect an object in the image and may determine whether the object is the device 100 based on the data associated with the detected object. In some implementations, the object recognition module 211 may detect the device 100 based on an identifying indicia disposed at the device 100. In some implementations, the processor 210 of the AR device may detect the device 100 based on an identifying indicia of the device 100. The identifying indicia may comprise a visible optical signature (e.g., an optical signature that is within visible wavelengths of light) or a non-visible signature (e.g., an optical signature that is not within the visible wavelengths of light). In some implementations, an AR symbol may be disposed on a surface of the device 100. The object recognition module 211 may detect the device 100 based on the AR symbol disposed on the device 100. In some implementations, the processor 210 of the AR device may detect the device 100 based on an audible signature, an infrared signature, and/or other signature emitted by the device 100.
- In an operation 706, the device 100 may receive, from the AR device 200, a control signal that indicates recognition of the device 100 within the AR environment 500. When one or more of the object recognition module 211 or the processor 210 identifies the device 100 within the AR environment 500, data indicating the recognition of the device 100 may be sent to the event handler module 213. The event handler module 213 may compare the data received from the object recognition module 211 with data stored in the storage to determine whether the data is associated with an event. When some or all of the received information is associated with an event that comprises recognition of the device 100, the event handler module 213 may transmit event data including the received information and data relating to the recognition of the device 100 to the control signal generation module 214. The control signal generation module 214 may be configured to receive the event data from the event handler module 213 and generate a control signal based on the event data for transmission to the device 100.
- The feedback control module 111 of the processor 110 may receive the control signal generated by the control signal generation module 214. The control signal may indicate the detection of the occurrence of the event in the AR environment 500.
- In an operation 708, feedback indicating recognition of the device 100 within the AR environment 500 may be provided via the device 100. The feedback control module 111 may cause the feedback device 120 to provide feedback via the device 100. In some implementations, the feedback control module 111 may be configured to provide the control signal to the feedback device 120. In these embodiments, the control signal may be directly applied to the feedback device 120 to cause the feedback. In some implementations, the feedback control module 111 may be configured to determine a feedback response based on the received control signal. The feedback control module 111 may be configured to generate indicated feedback signals of the feedback response and transmit the feedback signals to the respective feedback devices 120 to which the signals correspond.
- Feedback may be provided via the device 100. For example, one or more feedback devices 120 of the device 100 may provide feedback via the device 100. In some implementations, the feedback provided may be based on one or more feedback signals received from the feedback control module 111. The feedback may correspond to the event detected within the AR environment 500. For example, the feedback may be representative of the recognition of the device 100 within the AR environment 500.
- In an operation 710, an interaction between the device 100 and the AR environment 500 may be detected. For example, an interaction may be detected based on input received from the device 100 via the input component 150. An interaction may be detected based on a recognition of the device 100 in the AR environment 500, a movement of the device 100 in the AR environment 500, an interaction of the device 100 with a physical object 520a in the physical space 520 of the AR environment 500, an interaction of the device 100 with a virtual object 510a in the virtual space 510 of the AR environment 500, a feedback provided by the device 100 in the AR environment 500 (e.g., a visual feedback provided in the physical space 520 and/or other feedback recognizable by the AR device 200), a recognition of a user in the AR environment 500, a movement of a user in the AR environment 500, an interaction of a user with a physical object 520a in the physical space 520 of the AR environment 500, an interaction of a user with a virtual object 510a in the virtual space 510 of the AR environment, and/or another interaction between the device 100 and the AR environment 500.
- In an operation 712, the device 100 may receive, from the AR device 200, a control signal indicating an interaction between the device 100 and the AR environment 500. When the AR device 200 (including, for example, the object recognition module 211, the event handler module 213, the control signal generation module 214, and/or other modules or devices of the AR device 200) detects an interaction between the device 100 and the AR environment 500, the AR device 200 (including, for example, the event handler module 213, the control signal generation module 214, and/or other modules or devices of the AR device 200) may generate a control signal based on the detection of the interaction for transmission to the device 100.
- The feedback control module 111 of the processor 110 may receive the control signal generated by the AR device 200. The control signal may be based on the detection of the interaction between the device 100 and the AR environment 500.
- In an operation 714, feedback indicating the interaction may be provided via the device 100. The feedback control module 111 may cause the feedback device 120 to provide feedback via the device 100. In some implementations, the feedback control module 111 may be configured to provide the control signal to the feedback device 120. In these embodiments, the control signal may be directly applied to the feedback device 120 to cause the feedback. In some implementations, the feedback control module 111 may be configured to determine a feedback response based on the received control signal. The feedback control module 111 may be configured to generate indicated feedback signals of the feedback response and transmit the feedback signals to the respective feedback devices 120 to which the signals correspond.
- Feedback may be provided via the device 100. For example, one or more feedback devices 120 of the device 100 may provide feedback via the device 100. In some implementations, the feedback provided may be based on one or more feedback signals received from the feedback control module 111. The feedback may correspond to the interaction that was detected between the device 100 and the AR environment 500. For example, the feedback may be representative of the interaction between the device 100 and the AR environment 500.
- In an operation 716, a position and/or orientation of the device 100 within the AR environment 500 may be detected. In some implementations, the position/orientation device 140 of the device 100 may be configured to provide the AR device 200 with a position, an orientation, or both, via the communication channel 300. For example, the position/orientation device 140 may comprise a gyroscope, a geospatial positioning device, a compass, and/or other orienting/positioning devices. The processor 210 of the AR device 200 may be configured to determine the position of the device 100 and/or the orientation of the device 100 within the AR environment 500 based on the received position and/or orientation. In some implementations, a position indicator image and/or an orientation indicator image may be disposed on the device 100. The object recognition module 211 may recognize the position indicator image and/or the orientation indicator image when recognizing that the wearable object 100 is within the view of the imaging device and/or within the physical space 520 of the AR environment 500. The position indicator image and/or the orientation indicator image data may be processed by the object recognition module 211, the event handler module 213, and/or other modules of the AR device 200 to determine a position and/or an orientation of the device 100 within the AR environment 500.
- In an operation 718, the device 100 may be detected to be within a certain proximity of a virtual object in the AR environment 500. In some implementations, based on the determination of the position and/or orientation of the device 100 in the AR environment 500, the AR device 200 may determine whether the device 100 is within a certain proximity of a virtual object (e.g., virtual object 510n) in the AR environment 500.
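Once the device pose and the virtual object's placement are expressed in the same coordinate frame, the proximity test of operation 718 reduces to a distance comparison. The threshold value below is an arbitrary example, not a value from the patent:

```python
# Illustrative proximity check between the device 100 and a virtual object.
import math

def within_proximity(device_position, virtual_object_position, threshold=0.25):
    """True when the device 100 is within `threshold` (in AR-space units) of the virtual object."""
    distance = math.dist(device_position, virtual_object_position)
    return distance <= threshold
```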
- In an operation 720, the device 100 may receive, from the AR device 200, a control signal indicating one or more characteristics of a virtual object of the AR environment 500. The device 100 may receive the control signal indicating one or more characteristics of the virtual object 510a. In some implementations, the device 100 may receive the control signal based on the detection of the device 100 within a certain proximity of the virtual object 510a. In some implementations, the device 100 may receive the control signal separately from a detection of a position of the device 100 relative to the virtual object 510a.
- In an operation 722, feedback coordinating with the virtual object may be provided via the device 100. The feedback control module 111 of the processor 110 may receive the control signal indicating one or more characteristics of the virtual object 510a. The feedback control module 111 may cause the feedback device 120 to provide feedback via the device 100 based on the one or more characteristics of the virtual object 510a. The feedback control module 111 may be configured to determine a feedback response based on the received control signal. The feedback control module 111 may be configured to generate indicated feedback signals of the feedback response and transmit the feedback signals to the respective feedback devices 120 to which the signals correspond. The feedback signals may cause the respective feedback devices 120 to provide feedback corresponding to the one or more characteristics of the virtual object 510a.
- One or more feedback devices 120 of the device 100 may provide feedback via the device 100 that corresponds to the one or more characteristics of the virtual object 510a. For example, when the one or more characteristics of the virtual object 510a comprise a color, a pattern, and/or other visual characteristics, the feedback devices 120 may provide feedback of the same color, of a corresponding visual, audio, and/or haptic pattern, and/or other corresponding feedback. In some implementations, the feedback devices 120 may provide feedback via a light-emitting band at a portion of the device 100. The light-emitting band may include a color, a pattern, and/or other visual characteristics that coordinate with the virtual object 510a.
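Coordinating feedback with a virtual object's visual characteristics, as in the light-emitting-band example above, might be sketched as a simple mapping from the object's color and pattern to feedback parameters. The mapping and the field names are illustrative assumptions:

```python
# Hypothetical mapping from virtual-object characteristics to coordinated feedback.
def coordinated_feedback(virtual_object: dict) -> dict:
    """Build feedback parameters that echo the color/pattern of virtual object 510a."""
    return {
        "led_color": virtual_object.get("color", "white"),          # same color as 510a
        "haptic_pattern": virtual_object.get("pattern", "steady"),  # matching pulse pattern
        "audio_pattern": virtual_object.get("pattern", "steady"),
    }
```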
- In an operation 724, an interaction between a user and the AR environment 500 may be detected. For example, an interaction may be detected based on input received from the AR device 200 via the imaging device 220, the communication port 230, and/or other device. An interaction may be detected based on a recognition of a user in the AR environment 500, a movement of a user in the AR environment 500, an interaction of a user with a physical object 520a in the physical space 520 of the AR environment 500, an interaction of a user with a virtual object 510a in the virtual space 510 of the AR environment 500, a detection of a user within a certain physical proximity of a physical object 520a in the physical space 520 of the AR environment 500, a detection of a user within a certain proximity of a virtual object 510a in the virtual space 510 of the AR environment 500, and/or another interaction between the user and the AR environment 500. In some implementations, the AR device 200 (including, for example, the object recognition module 211, the event handler module 213, the control signal generation module 214, and/or other modules or devices of the AR device 200) may only detect interactions of users that have information associated with the device 100 and/or the AR device 200. In some implementations, the AR device 200 (including, for example, the object recognition module 211, the event handler module 213, the control signal generation module 214, and/or other modules or devices of the AR device 200) may detect an interaction between any user and the AR environment 500.
- In an operation 726, the device 100 may receive, from the AR device 200, a control signal indicating an interaction between the user and the AR environment 500. When the AR device 200 (including, for example, the object recognition module 211, the event handler module 213, the control signal generation module 214, and/or other modules or devices of the AR device 200) detects an interaction between the user and the AR environment 500, the AR device 200 (including, for example, the event handler module 213, the control signal generation module 214, and/or other modules or devices of the AR device 200) may generate a control signal based on the detection of the interaction for transmission to the device 100.
- The feedback control module 111 of the processor 110 may receive the control signal generated by the AR device 200. The control signal may be based on the detection of the interaction between the user and the AR environment 500.
- In an operation 728, feedback indicating the interaction may be provided via the device 100. The feedback control module 111 may cause the feedback device 120 to provide feedback via the device 100. In some implementations, the feedback control module 111 may be configured to provide the control signal to the feedback device 120. In these embodiments, the control signal may be directly applied to the feedback device 120 to cause the feedback. In some implementations, the feedback control module 111 may be configured to determine a feedback response based on the received control signal. The feedback control module 111 may be configured to generate indicated feedback signals of the feedback response and transmit the feedback signals to the respective feedback devices 120 to which the signals correspond.
- Feedback may be provided via the device 100. For example, one or more feedback devices 120 of the device 100 may provide feedback via the device 100. In some implementations, the feedback provided may be based on one or more feedback signals received from the feedback control module 111. The feedback may correspond to the interaction that was detected between the user and the AR environment 500. For example, the feedback may be representative of the interaction between the user and the AR environment 500. Although the system(s) and/or method(s) of this disclosure have been described in detail for the purpose of illustration based on what is currently considered to be the most practical and preferred implementations, it is to be understood that such detail is solely for that purpose and that the disclosure is not limited to the disclosed implementations, but, on the contrary, is intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present disclosure contemplates that, to the extent possible, one or more features of any implementation can be combined with one or more features of any other implementation.
Cited By (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9785238B2 (en) | 2008-07-15 | 2017-10-10 | Immersion Corporation | Systems and methods for transmitting haptic messages |
US20120326999A1 (en) * | 2011-06-21 | 2012-12-27 | Northwestern University | Touch interface device and method for applying lateral forces on a human appendage |
US10007341B2 (en) * | 2011-06-21 | 2018-06-26 | Northwestern University | Touch interface device and method for applying lateral forces on a human appendage |
US20140267013A1 (en) * | 2013-03-15 | 2014-09-18 | Immersion Corporation | User interface device provided with surface haptic sensations |
US9041647B2 (en) * | 2013-03-15 | 2015-05-26 | Immersion Corporation | User interface device provided with surface haptic sensations |
US20170291104A1 (en) * | 2013-03-15 | 2017-10-12 | Immersion Corporation | User Interface Device Provided With Surface Haptic Sensations |
US9715278B2 (en) | 2013-03-15 | 2017-07-25 | Immersion Corporation | User interface device provided with surface haptic sensations |
US20150185848A1 (en) * | 2013-12-31 | 2015-07-02 | Immersion Corporation | Friction augmented controls and method to convert buttons of touch control panels to friction augmented controls |
US10444829B2 (en) | 2014-05-05 | 2019-10-15 | Immersion Corporation | Systems and methods for viewport-based augmented reality haptic effects |
US9690370B2 (en) | 2014-05-05 | 2017-06-27 | Immersion Corporation | Systems and methods for viewport-based augmented reality haptic effects |
US9946336B2 (en) | 2014-05-05 | 2018-04-17 | Immersion Corporation | Systems and methods for viewport-based augmented reality haptic effects |
US10996776B2 (en) * | 2014-10-31 | 2021-05-04 | Sony Corporation | Electronic device and feedback providing method |
US20170300141A1 (en) * | 2014-10-31 | 2017-10-19 | Sony Corporation | Electronic device and feedback providing method |
US11726589B2 (en) * | 2014-10-31 | 2023-08-15 | Sony Corporation | Electronic device and feedback providing method |
US20210191537A1 (en) * | 2014-10-31 | 2021-06-24 | Sony Corporation | Electronic device and feedback providing method |
US10572020B2 (en) | 2014-12-04 | 2020-02-25 | Immersion Corporation | Device and method for controlling haptic signals |
US10175763B2 (en) * | 2014-12-04 | 2019-01-08 | Immersion Corporation | Device and method for controlling haptic signals |
US20180046251A1 (en) * | 2014-12-04 | 2018-02-15 | Immersion Corporation | Device and Method for Controlling Haptic Signals |
US20160189427A1 (en) * | 2014-12-31 | 2016-06-30 | Immersion Corporation | Systems and methods for generating haptically enhanced objects for augmented and virtual reality applications |
CN105739683A (en) * | 2014-12-31 | 2016-07-06 | 意美森公司 | Systems and methods for generating haptically enhanced objects for augmented and virtual reality applications |
US10921896B2 (en) | 2015-03-16 | 2021-02-16 | Facebook Technologies, Llc | Device interaction in augmented reality |
WO2017127128A1 (en) * | 2016-01-22 | 2017-07-27 | Elwha Llc | Feedback for enhanced situational awareness |
US12149591B2 (en) * | 2016-01-25 | 2024-11-19 | Hiscene Information Technology Co., Ltd | Method and apparatus for augmented reality interaction and presentation |
US20210385299A1 (en) * | 2016-01-25 | 2021-12-09 | Hiscene Information Technology Co., Ltd | Method and apparatus for augmented reality interaction and presentation |
US10974138B2 (en) | 2016-12-08 | 2021-04-13 | Immersion Corporation | Haptic surround functionality |
CN108269911A (en) * | 2016-12-30 | 2018-07-10 | 意美森公司 | Inertial haptic actuator with cantilever beam and intellectual material |
EP3342489A1 (en) * | 2016-12-30 | 2018-07-04 | Immersion Corporation | Inertial haptic actuators having a cantilevered beam and a smart material |
US11726557B2 (en) | 2017-03-21 | 2023-08-15 | Interdigital Vc Holdings, Inc. | Method and system for the detection and augmentation of tactile interactions in augmented reality |
US12008154B2 (en) | 2017-03-21 | 2024-06-11 | Interdigital Vc Holdings, Inc. | Method and system for the detection and augmentation of tactile interactions in augmented reality |
US10706251B2 (en) | 2017-06-20 | 2020-07-07 | Lg Electronics Inc. | Mobile terminal |
US10699094B2 (en) * | 2017-06-20 | 2020-06-30 | Lg Electronics Inc. | Mobile terminal |
US20180365466A1 (en) * | 2017-06-20 | 2018-12-20 | Lg Electronics Inc. | Mobile terminal |
US11270597B2 (en) | 2018-05-01 | 2022-03-08 | Codescribe Llc | Simulated reality technologies for enhanced medical protocol training |
WO2019213272A1 (en) * | 2018-05-01 | 2019-11-07 | Jimenez Ronald W | Simulated reality technologies for enhanced medical protocol training |
US11875693B2 (en) | 2018-05-01 | 2024-01-16 | Codescribe Corporation | Simulated reality technologies for enhanced medical protocol training |
US12183215B2 (en) | 2018-05-01 | 2024-12-31 | Codescribe Corporation | Simulated reality technologies for enhanced medical protocol training |
CN109917911A (en) * | 2019-02-20 | 2019-06-21 | 西北工业大学 | A design method of vibrotactile feedback device based on information-physical interaction |
US10852827B1 (en) | 2019-03-25 | 2020-12-01 | Facebook Technologies, Llc | Tactile simulation of initial contact with virtual objects |
US11397467B1 (en) | 2019-03-25 | 2022-07-26 | Facebook Technologies, Llc | Tactile simulation of initial contact with virtual objects |
CN110442233A (en) * | 2019-06-18 | 2019-11-12 | 中国人民解放军军事科学院国防科技创新研究院 | Augmented reality keyboard-and-mouse system based on gesture interaction |
Also Published As
Publication number | Publication date |
---|---|
JP2018073433A (en) | 2018-05-10 |
KR20140092267A (en) | 2014-07-23 |
CN110262666A (en) | 2019-09-20 |
JP6713445B2 (en) | 2020-06-24 |
JP6271256B2 (en) | 2018-01-31 |
CN103970265B (en) | 2019-08-23 |
EP2755113A3 (en) | 2016-12-28 |
CN103970265A (en) | 2014-08-06 |
JP2014139784A (en) | 2014-07-31 |
EP2755113A2 (en) | 2014-07-16 |
Similar Documents
Publication | Title |
---|---|
US20140198130A1 (en) | Augmented reality user interface with haptic feedback | |
US11045725B1 (en) | Controller visualization in virtual and augmented reality environments | |
US10269222B2 (en) | System with wearable device and haptic output device | |
CN110476168B (en) | Method and system for hand tracking | |
US10083544B2 (en) | System for tracking a handheld device in virtual reality | |
US8179604B1 (en) | Wearable marker for passive interaction | |
CN105900041B (en) | It is positioned using the target that eye tracking carries out | |
US20160180594A1 (en) | Augmented display and user input device | |
US9569001B2 (en) | Wearable gestural interface | |
JP2020091904A (en) | System and controller | |
US11782514B2 (en) | Wearable device and control method thereof, gesture recognition method, and control system | |
US20160189427A1 (en) | Systems and methods for generating haptically enhanced objects for augmented and virtual reality applications | |
US20180188894A1 (en) | Virtual Touchpads For Wearable And Portable Devices | |
CN111857365B (en) | Stylus-based input for head-mounted devices | |
KR101343748B1 (en) | Transparent display virtual touch apparatus without pointer | |
US20190049558A1 (en) | Hand Gesture Recognition System and Method | |
CN107710009B (en) | Controller visualization in virtual and augmented reality environments | |
US11054941B2 (en) | Information processing system, information processing method, and program for correcting operation direction and operation amount | |
US20230136028A1 (en) | Ergonomic eyes-off-hand multi-touch input | |
US20240168565A1 (en) | Single-handed gestures for reviewing virtual content | |
WO2024049463A1 (en) | Virtual keyboard | |
WO2025072024A1 (en) | Devices, methods, and graphical user interfaces for processing inputs to a three-dimensional environment | |
KR20160062906A (en) | augmented reality Input Method for Wearable device |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: IMMERSION CORPORATION, CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: LACROIX, ROBERT; REEL/FRAME: 029631/0538. Effective date: 20130111 |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |