Canessa et al., 2014 - Google Patents
Calibrated depth and color cameras for accurate 3D interaction in a stereoscopic augmented reality environment (Canessa et al., 2014)
- Document ID
- 16672873129279880727
- Authors
- Canessa A
- Chessa M
- Gibaldi A
- Sabatini S
- Solari F
- Publication year
- 2014
- Publication venue
- Journal of Visual Communication and Image Representation
Snippet
A human–machine interaction system requires precise information about the user's body position in order to allow natural 3D interaction in stereoscopic augmented reality environments, where real and virtual objects should coherently coexist. The diffusion of RGB …
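The paper's topic is the joint calibration of depth and color cameras for 3D interaction. As a rough illustration of what such a calibration enables (this is not the authors' procedure), the sketch below back-projects a depth-camera pixel into a 3D point and reprojects it into the color image, assuming pinhole models for both cameras and a known depth-to-color rigid transform; all numerical values (`K_d`, `K_c`, `R`, `t`) are made-up placeholders.

```python
import numpy as np

# Minimal sketch (not the authors' method): map a depth-camera pixel with a
# metric depth value to a 3D point, then reproject it into the color camera
# using assumed calibration data. All values below are illustrative only.

# Assumed pinhole intrinsics for the depth and color cameras (placeholders).
K_d = np.array([[580.0, 0.0, 320.0],
                [0.0, 580.0, 240.0],
                [0.0, 0.0, 1.0]])
K_c = np.array([[525.0, 0.0, 320.0],
                [0.0, 525.0, 240.0],
                [0.0, 0.0, 1.0]])

# Assumed depth-to-color extrinsics: rotation R and translation t in metres.
R = np.eye(3)
t = np.array([0.025, 0.0, 0.0])  # e.g. a 2.5 cm horizontal baseline

def depth_pixel_to_color_pixel(u, v, z):
    """Back-project depth pixel (u, v) with depth z [m] and project it into the color image."""
    # Back-project to a 3D point in the depth-camera frame.
    p_d = z * np.linalg.inv(K_d) @ np.array([u, v, 1.0])
    # Transform the point into the color-camera frame.
    p_c = R @ p_d + t
    # Project with the color-camera intrinsics and dehomogenize.
    uvw = K_c @ p_c
    return uvw[:2] / uvw[2]

if __name__ == "__main__":
    # Example: a pixel near the image centre, 1.2 m away from the depth camera.
    print(depth_pixel_to_color_pixel(400, 300, 1.2))
```

Once this pixel-to-pixel mapping is available, each depth measurement can be colored and, conversely, body parts detected in the color image can be located in 3D, which is the kind of registration the title refers to.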
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
Similar Documents
| Publication | Title |
|---|---|
| Canessa et al. | Calibrated depth and color cameras for accurate 3D interaction in a stereoscopic augmented reality environment |
| US11928838B2 (en) | Calibration system and method to align a 3D virtual scene and a 3D real world for a stereoscopic head-mounted display |
| US11652965B2 (en) | Method of and system for projecting digital information on a real object in a real environment |
| JP6690041B2 (en) | Method and device for determining point of gaze on three-dimensional object |
| CN109801379B (en) | Universal augmented reality glasses and calibration method thereof |
| Mansouryar et al. | 3D gaze estimation from 2D pupil positions on monocular head-mounted eye trackers |
| Holloway | Registration error analysis for augmented reality |
| Harders et al. | Calibration, registration, and synchronization for high precision augmented reality haptics |
| US20200363867A1 (en) | Blink-based calibration of an optical see-through head-mounted display |
| US20140218281A1 (en) | Systems and methods for eye gaze determination |
| Azimi et al. | Alignment of the virtual scene to the tracking space of a mixed reality head-mounted display |
| Itoh et al. | Light-field correction for spatial calibration of optical see-through head-mounted displays |
| Ballestin et al. | A registration framework for the comparison of video and optical see-through devices in interactive augmented reality |
| Chi et al. | A novel multi-camera global calibration method for gaze tracking system |
| Lee et al. | A computer vision system for on-screen item selection by finger pointing |
| Solari et al. | Natural perception in dynamic stereoscopic augmented reality environments |
| Wang et al. | Accuracy of monocular gaze tracking on 3D geometry |
| Hua et al. | A testbed for precise registration, natural occlusion and interaction in an augmented environment using a head-mounted projective display (HMPD) |
| Madritsch et al. | CCD-camera based optical beacon tracking for virtual and augmented reality |
| TWI664576B (en) | Head mounted device and control method |
| Liu et al. | A portable projection mapping device for medical augmented reality in single-stage cranioplasty |
| Plopski et al. | Tracking systems: Calibration, hardware, and peripherals |
| Wang et al. | Inverse visualization concept for RGB-D augmented C-arms |
| US20240331329A1 (en) | Method and system for superimposing two-dimensional (2D) images over deformed surfaces |
| Ravi et al. | A study of object recognition and tracking techniques for augmented reality applications |