WO2013017991A1 - Remote control with first and second sensors
- Publication number
- WO2013017991A1 (PCT/IB2012/053771)
- Authority
- WO
- WIPO (PCT)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/30—User interface
- G08C2201/32—Remote control based on movements, attitude of remote control device
Definitions
- the invention relates to an apparatus for converting control information from a remote control into a control instruction for a device.
- the invention further relates to a remote control comprising the apparatus, to a device comprising the apparatus, to a method for converting control information from a remote control into a control instruction for a device, to a computer program product for performing the step of the method, and to a medium for storing and comprising the computer program product.
- Examples of such a device are devices with displays and other devices that are to be controlled remotely.
- WO 2004/047011 and US 2010/0157033 disclose pointing devices with cameras.
- an apparatus for converting control information from a remote control into a control instruction for a device, the remote control comprising a first sensor for converting image information into first sensor information and comprising a second sensor for converting geometrical information into second sensor information, the control information comprising a combination of the first sensor information and the second sensor information.
- the apparatus may form part of the remote control or may form part of the device or may be located between the remote control and the device.
- By having provided the remote control with at least two different sensors, and by letting the apparatus convert the control information from the remote control into the control instruction for the device, which control information comprises the combination of the first and second sensor information from the different sensors, an improved apparatus has been created.
- Such an improved apparatus offers more possibilities because different sensors are used in combination.
- An embodiment of the apparatus is defined by the apparatus being arranged to make the combination by using the second sensor information for checking, completing, correcting and/or overruling the first sensor information.
- In the case of checking, the control information comprises the first sensor information that has been checked by using the second sensor information.
- In the case of completing, the control information comprises the first sensor information that has been completed by using the second sensor information.
- In the case of correcting, the control information comprises the first sensor information that has been corrected by using the second sensor information.
- In the case of overruling, the control information comprises the second sensor information that has overruled the first sensor information.
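Purely as an illustration of these four modes (a hypothetical Python sketch; the function names, tolerance, and blending rule are invented here, not taken from the publication):

```python
# Hypothetical sketch of the four combination modes (names invented, not
# from the publication): second-sensor (accelerometer) data is used to
# check, complete, correct and/or overrule first-sensor (camera) data.

def combine(camera_dx, accel_dx, tolerance=0.5):
    """Return a horizontal displacement, fusing camera and accelerometer
    estimates. Inputs are displacements in arbitrary units, or None when
    a sensor produced no sample."""
    if camera_dx is None:
        # Completing: fill a camera dropout with the accelerometer estimate.
        return accel_dx
    if accel_dx is None:
        return camera_dx
    if abs(camera_dx - accel_dx) <= tolerance:
        # Checking passed: the camera value is confirmed, use it as-is.
        return camera_dx
    if abs(camera_dx) > 10 * max(abs(accel_dx), 1e-9):
        # Overruling: an implausibly large camera jump (e.g. a beacon
        # mis-detection) is discarded in favour of the accelerometer.
        return accel_dx
    # Correcting: otherwise nudge the camera value toward the accelerometer.
    return 0.7 * camera_dx + 0.3 * accel_dx

print(combine(1.0, 1.2))   # checking: camera confirmed -> 1.0
print(combine(None, 1.2))  # completing: camera dropout -> 1.2
print(combine(50.0, 1.0))  # overruling: camera glitch  -> 1.0
print(combine(3.0, 1.0))   # correcting: weighted blend -> 2.4
```

In such a scheme the camera remains the primary source, with the accelerometer acting as a plausibility reference.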
- An embodiment of the apparatus is defined by the first sensor comprising a camera, the image information comprising an image showing a single beacon, two or more beacons, a beacon and a non-beacon and/or the device or a part thereof, and the first sensor information comprising a detection of the single beacon, the two or more beacons, the beacon and the non-beacon and/or the device or the part thereof.
- An embodiment of the apparatus is defined by the second sensor comprising an acceleration detector, a motion detector, a movement detector, an angle detector, a tilt detector, an orientation detector and/or a rotation detector, the geometrical information comprising an acceleration in at least one direction, a motion in at least one direction, a movement in at least one direction, an angle with respect to at least one direction, a tilt with respect to at least one direction, an orientation with respect to at least one direction and/or a rotation with respect to at least one direction, and the second sensor information comprising a detection of the acceleration in the at least one direction, the motion in the at least one direction, the movement in the at least one direction, the angle with respect to the at least one direction, the tilt with respect to the at least one direction, the orientation with respect to the at least one direction and/or the rotation with respect to the at least one direction.
- a combination of a first sensor in the form of a camera and a second sensor in the form of one of the detectors defined above has proven to be advantageous.
- An embodiment of the apparatus is defined by the control instruction comprising a pointing position on the device, a distance between the remote control and the device, a location of the remote control with respect to the device, an acceleration of the remote control in at least one direction, a motion of the remote control in at least one direction, a movement of the remote control in at least one direction, an angle of the remote control with respect to at least one direction, a tilt of the remote control with respect to at least one direction, an orientation of the remote control with respect to at least one direction and/or a rotation of the remote control with respect to at least one direction.
- Such an apparatus forms, together with the remote control and the device, a gesturing control system.
- An embodiment of the apparatus is defined by the apparatus providing gesture detection.
- With the control information comprising a combination of the first and second sensor information, an improved gesture detection has become possible.
- An embodiment of the apparatus is defined by the apparatus providing tilt compensation.
- With the control information comprising a combination of the first and second sensor information, an improved tilt compensation has become possible.
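To make the idea concrete, a minimal tilt-compensation sketch follows, assuming the second sensor is a three-axis accelerometer measuring gravity; the function names and axis conventions are invented for this illustration, not prescribed by the publication:

```python
import math

# Minimal tilt-compensation sketch (illustrative only). Assumption: the
# accelerometer measures gravity while the remote is held roughly still,
# with x to the right and y up when the remote is level.

def roll_from_gravity(gx, gy):
    """Roll angle about the pointing axis, in radians, estimated from the
    gravity components in the remote's frame; level gives (gx, gy) = (0, -g)."""
    return math.atan2(-gx, -gy)

def tilt_compensate(dx, dy, roll):
    """Rotate a camera-plane displacement (dx, dy) back by the roll angle so
    that a physically horizontal stroke stays horizontal on the display."""
    c, s = math.cos(roll), math.sin(roll)
    return c * dx - s * dy, s * dx + c * dy

# Remote rolled by 30 degrees: gravity leaks into the x axis ...
g = 9.81
phi = roll_from_gravity(-g * math.sin(math.radians(30)),
                        -g * math.cos(math.radians(30)))
# ... and a horizontal stroke appears tilted by -30 degrees in the image.
dx, dy = math.cos(math.radians(-30)), math.sin(math.radians(-30))
print(tilt_compensate(dx, dy, phi))  # ~(1.0, 0.0): horizontal again
```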
- a remote control comprising the apparatus as defined above.
- a device comprising the apparatus as defined above.
- a method for converting control information from a remote control into a control instruction for a device, the remote control comprising a first sensor for converting image information into first sensor information and comprising a second sensor for converting geometrical information into second sensor information, the method comprising a step of converting the control information comprising a combination of the first sensor information and the second sensor information into the control instruction.
- An embodiment of the method is defined by the method further comprising a step of making the combination by using the second sensor information for checking, completing, correcting and/or overruling the first sensor information.
- a computer program product is provided for performing the step of the method as defined above.
- a medium for storing and comprising the computer program product as defined above.
- a first sensor may provide first sensor information only, and a basic idea could be that a combination of first and second sensors may provide a combination of first and second sensor information.
- Fig. 1 shows a trajectory of a remote control comprising a first sensor in the form of a camera
- Fig. 2 shows a trajectory of a remote control comprising a second sensor in the form of an acceleration detector
- Fig. 3 shows a remote control comprising an apparatus and shows a device
- Fig. 4 shows a remote control and shows a device comprising an apparatus.
- a trajectory of a remote control comprising a first sensor in the form of a camera is shown.
- the device comprises, for example, a single beacon that is to be detected by the camera, or the entire device is to be detected by the camera as a single light point or blob.
- fast left-right movements made by a user are not detected as fully horizontal left-right movements by the remote control.
- a first reason might be that the left-right movements made by the user are not precisely horizontal.
- a second reason might be that the user is not making left-right movements but is rotating the remote control (e.g. rotating the remote control, such as a handheld, around an axis like a line between the handheld and the device when the handheld is kept relatively straight in front of the device) or making circular movements with the remote control (e.g. circles in a plane relatively parallel to a front side of the device).
- the remote control comprising only the first sensor in the form of the camera cannot distinguish between these two reasons.
- the remote control comprising only the first sensor in the form of the camera cannot distinguish between a rotation and a circular movement.
- a trajectory of a remote control comprising a second sensor in the form of an acceleration detector is shown. Fast left-right movements made by a user are not detected as fully horizontal left-right movements by the remote control at all. The faster the horizontal movements, the more the data from the acceleration detector, such as an accelerometer, will show a deviation from a horizontal movement. But even for relatively slow movements, the acceleration detector will not show a straight line.
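A small numeric illustration (ours, not from the publication; standard gravity assumed) of why the trace deviates: once the remote is tilted, part of gravity leaks into the nominally horizontal axis and is indistinguishable from real lateral motion:

```python
import math

# Illustrative only: with the remote pitched/rolled by theta, a
# g*sin(theta) component of gravity appears on the 'horizontal' axis
# and masquerades as lateral acceleration.
g = 9.81
for theta_deg in (0, 2, 5, 10):
    leak = g * math.sin(math.radians(theta_deg))
    print(f"tilt {theta_deg:2d} deg -> spurious lateral accel {leak:.2f} m/s^2")
```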
- in the Fig. 3, a remote control 10 comprising an apparatus 1 is shown, together with a device 20.
- the remote control 10 comprises a first sensor 11 and comprises a second sensor 12.
- the first sensor 11 converts image information into first sensor information provided to the apparatus 1 and the second sensor 12 converts geometrical information into second sensor information provided to the apparatus 1.
- the apparatus 1 converts control information comprising a combination of the first sensor information and the second sensor information into a control instruction and supplies it to a transmitter 13.
- the transmitter 13 transmits the control instruction to a receiver 23 in the device 20.
- the receiver 23 supplies the control instruction to a device controller 24.
- Alternatively, one or more controllers may be present. Further alternatively, these one or more controllers may be located in the sensors 11-12 or in the apparatus 1, or the apparatus 1 may form part of such controller(s).
- in the Fig. 4, the remote control 10 comprises a first sensor 11 and comprises a second sensor 12.
- the first sensor 11 converts image information into first sensor information provided to a transmitter 13, possibly via a controller 14, and the second sensor 12 converts geometrical information into second sensor information provided to the transmitter 13, possibly via the controller 14.
- the transmitter 13 transmits the first and second sensor information to a receiver 23 in the device 20.
- the receiver 23 supplies the first and second sensor information to the apparatus 1.
- the apparatus 1 converts control information comprising a combination of the first sensor information and the second sensor information into a control instruction and supplies it to a device controller 24.
- parts of the controller 14 may be located in the sensors 11-12.
- the device controller 24 may be located in the apparatus 1, or the apparatus 1 may form part of the device controller 24.
- the apparatus 1 combines the first and second sensor information by using the second sensor information for checking, completing, correcting and/or overruling the first sensor information.
- For the checking, the data from Fig. 2 could be used to check the data from Fig. 1.
- For the completing, the data from Fig. 2 could be used to complete the data from Fig. 1.
- For the correcting, in case some of the data from Fig. 1 is partly incorrect for some positions or for some moments in time, the data from Fig. 2 could be used to correct the data from Fig. 1.
- For the overruling, the data from Fig. 2 could be used to overrule the data from Fig. 1.
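For the completing case in particular, one can picture bridging camera dropouts by dead-reckoning from acceleration samples. The sketch below is a simplified, hypothetical rendering (invented names, naive double integration), not the method of the publication:

```python
# Sketch of the 'completing' mode: camera position samples lost during a
# beacon dropout (None) are bridged by integrating acceleration samples.

def complete_track(camera_x, accel_x, dt=0.02):
    """camera_x: 1-D positions, with None where the beacon was lost;
    accel_x: accelerations, one per sample; returns a gap-free track."""
    out, velocity = [], 0.0
    for pos, acc in zip(camera_x, accel_x):
        if pos is not None:
            if out:
                velocity = (pos - out[-1]) / dt  # refresh velocity at a fix
            out.append(pos)                      # camera fix: trust it
        else:
            velocity += acc * dt                 # acceleration -> velocity
            last = out[-1] if out else 0.0
            out.append(last + velocity * dt)     # velocity -> position
    return out

track = complete_track([0.0, 0.1, None, None, 0.4],
                       [0.0, 0.0, 2.0, 2.0, 0.0])
print(track)  # the two camera gaps are filled by dead-reckoned positions
```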
- the first sensor 11 comprises a camera
- the image information comprises an image showing a single beacon, two or more beacons, a beacon and a non- beacon and/or the device 20 or a part thereof
- the first sensor information comprises a detection of the single beacon, the two or more beacons, the beacon and the non-beacon and/or the device 20 or the part thereof.
- Whether or not the remote control 10 comprising only the first sensor 11 in the form of the camera can distinguish between pure left-right movements and left-right movements accompanied by some rotation, and between a pure rotation and a circular movement, the combination of the two different sensors 11 and 12 will improve that situation too.
- the second sensor 12 comprises an acceleration detector, a motion detector, a movement detector, an angle detector, a tilt detector, an orientation detector and/or a rotation detector
- the geometrical information comprises an acceleration in at least one direction, a motion in at least one direction, a movement in at least one direction, an angle with respect to at least one direction, a tilt with respect to at least one direction, an orientation with respect to at least one direction, and/or a rotation with respect to at least one direction
- the second sensor information comprises a detection of the acceleration in the at least one direction, the motion in the at least one direction, the movement in the at least one direction, the angle with respect to the at least one direction, the tilt with respect to the at least one direction, the orientation with respect to the at least one direction and/or the rotation with respect to the at least one direction.
- the control instruction comprises a pointing position on the device 20, a distance between the remote control 10 and the device 20, a location of the remote control 10 with respect to the device 20, an acceleration of the remote control 10 in at least one direction, a motion of the remote control 10 in at least one direction, a movement of the remote control 10 in at least one direction, an angle of the remote control 10 with respect to at least one direction, a tilt of the remote control 10 with respect to at least one direction, an orientation of the remote control 10 with respect to at least one direction and/or a rotation of the remote control 10 with respect to at least one direction.
- the remote control 10 comprising the first sensor 11 in the form of the camera translates, for example, a light spot or a blob captured from a beacon, or a more precise device detection, into a cursor position on a display of the device 20.
- a gesture made by a user who is holding the remote control 10 results in a sequence of light spot samples or blob samples or device detections on the camera sensor. By processing such sequences, a gesture can be detected.
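As a hypothetical illustration of processing such a sequence (names and pixel thresholds invented here), a run of blob positions could be classified into a simple swipe:

```python
# Illustrative only: classify a sequence of blob positions on the camera
# sensor as a left/right/up/down swipe gesture.

def classify_swipe(samples, min_travel=40.0, straightness=2.0):
    """samples: list of (x, y) blob positions in pixels, oldest first.
    Returns 'left'/'right'/'up'/'down' or None if no clear swipe."""
    (x0, y0), (x1, y1) = samples[0], samples[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= min_travel and abs(dx) >= straightness * abs(dy):
        return "right" if dx > 0 else "left"
    if abs(dy) >= min_travel and abs(dy) >= straightness * abs(dx):
        return "down" if dy > 0 else "up"
    return None  # too short or too diagonal to call

# A blob drifting 60 px to the right across five camera frames:
print(classify_swipe([(10, 100), (25, 102), (40, 99), (55, 101), (70, 100)]))
```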
- Gesture detection based on geometrical data only, such as for example acceleration data, is relatively unreliable (a relatively high probability of false positives and false negatives) due to the combination of gravity and external forces.
- An acceleration-based detection of simple horizontal movements is difficult and may cause latencies in detection.
- Latencies are typically introduced by filtering sensor data. The filtering is applied to suppress the impact of noise from external forces.
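The latency trade-off can be made concrete with a toy low-pass filter, a sketch of the general idea rather than the publication's filter: the stronger the smoothing, the later the output reacts to a real step in acceleration:

```python
# Toy low-pass (exponential moving average) illustrating the latency that
# filtering introduces: heavier smoothing = slower reaction to a real step.

def ema(samples, alpha):
    """alpha in (0, 1]: small alpha = strong smoothing = more latency."""
    out, y = [], samples[0]
    for x in samples:
        y = alpha * x + (1 - alpha) * y  # blend each new sample into the state
        out.append(y)
    return out

step = [0.0] * 5 + [1.0] * 5                  # a real step at sample 5
print([round(v, 2) for v in ema(step, 0.9)])  # reacts almost immediately
print([round(v, 2) for v in ema(step, 0.2)])  # noise-robust but lagging
```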
- geometrical data, such as for example acceleration data, and optical data are combined.
- the more accurate optical trajectory information from the camera can be fully exploited, while the acceleration data is used to derive an orientation.
- This increases the correct detection rate of gestures, where acceleration data assists in optical detection and vice versa.
- the combination of optical and acceleration-based detection also results in a richer set of gestures that can be recognized compared to the "optical only" and "acceleration only" situations. Improvements are also achieved in case tilt-correction is implemented. With the combination of two or more different sensors 11-12, more different gestures can be detected more precisely.
- an apparatus 1 converts control information comprising a combination of first and second sensor information from a remote control 10 into a control instruction for a device 20.
- the remote control 10 comprises a first sensor 11 for converting image information into the first sensor information and a second sensor 12 for converting geometrical information into the second sensor information.
- the apparatus 1 may form part of the remote control 10 or the device 20 or may be located in between. At least two different sensors 11, 12 used in combination in the remote control 10 provide a great advantage.
- the apparatus 1 may make the combination by using the second sensor information for checking, completing, correcting and/or overruling the first sensor information.
- the first sensor 11 may comprise a camera
- the second sensor may comprise a detector for detecting an acceleration, a motion, a movement, an angle, a tilt, an orientation and/or a rotation.
Abstract
An apparatus (1) converts control information comprising a combination of first and second sensor information from a remote control (10) into a control instruction for a device (20). The remote control (10) comprises a first sensor (11) for converting image information into first sensor information and a second sensor (12) for converting geometrical information into second sensor information. The apparatus (1) may form part of the remote control (10) or of the device (20) or may be located between the two. At least two different sensors (11, 12) used in combination in the remote control (10) offer a considerable advantage. The apparatus (1) may make the combination by using the second sensor information for checking, completing, correcting and/or overruling the first sensor information. The first sensor (11) may comprise a camera; the second sensor may comprise a detector for detecting an acceleration, a movement, an angle, a tilt, an orientation and/or a rotation.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161514109P | 2011-08-02 | 2011-08-02 | |
US61/514,109 | 2011-08-02 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013017991A1 (fr) | 2013-02-07 |
Family
ID=47073480
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2012/053771 WO2013017991A1 (fr) | 2012-07-25 | Remote control with first and second sensors |
Country Status (2)
Country | Link |
---|---|
TW (1) | TW201324248A (fr) |
WO (1) | WO2013017991A1 (fr) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- TWI601033B (zh) * | 2014-07-08 | 2017-10-01 | 拓連科技股份有限公司 | Management method and system for motion detection, and related computer program product |
2012
- 2012-07-25 WO PCT/IB2012/053771 patent/WO2013017991A1/fr active Application Filing
- 2012-07-30 TW TW101127494A patent/TW201324248A/zh unknown
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- WO2004047011A2 (fr) | 2002-11-20 | 2004-06-03 | Koninklijke Philips Electronics N.V. | User interface system based on a pointing device |
US20050212766A1 (en) * | 2004-03-23 | 2005-09-29 | Reinhardt Albert H M | Translation controlled cursor |
US20100157033A1 (en) | 2005-08-11 | 2010-06-24 | Koninklijke Philips Electronics, N.V. | Method of determining the motion of a pointing device |
US20070236451A1 (en) * | 2006-04-07 | 2007-10-11 | Microsoft Corporation | Camera and Acceleration Based Interface for Presentations |
US20080068336A1 (en) * | 2006-09-19 | 2008-03-20 | Samsung Electronics Co., Ltd. | Input device and method and medium for providing movement information of the input device |
- WO2010007566A1 (fr) * | 2008-07-18 | 2010-01-21 | Koninklijke Philips Electronics N.V. | Camera device and display device |
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10593116B2 (en) | 2016-10-24 | 2020-03-17 | Snap Inc. | Augmented reality object manipulation |
US10692285B2 (en) | 2016-10-24 | 2020-06-23 | Snap Inc. | Redundant tracking system |
US10803664B2 (en) | 2016-10-24 | 2020-10-13 | Snap Inc. | Redundant tracking system |
US12094063B2 (en) | 2016-10-24 | 2024-09-17 | Snap Inc. | Redundant tracking system |
US11481978B2 (en) | 2016-10-24 | 2022-10-25 | Snap Inc. | Redundant tracking system |
US11580700B2 (en) | 2016-10-24 | 2023-02-14 | Snap Inc. | Augmented reality object manipulation |
- EP3529750A4 (fr) * | 2016-10-24 | 2019-10-16 | Snap Inc. | Redundant tracking system |
US11704878B2 (en) | 2017-01-09 | 2023-07-18 | Snap Inc. | Surface aware lens |
US11195338B2 (en) | 2017-01-09 | 2021-12-07 | Snap Inc. | Surface aware lens |
US12217374B2 (en) | 2017-01-09 | 2025-02-04 | Snap Inc. | Surface aware lens |
US11715268B2 (en) | 2018-08-30 | 2023-08-01 | Snap Inc. | Video clip object tracking |
US11836859B2 (en) | 2018-11-27 | 2023-12-05 | Snap Inc. | Textured mesh building |
US11620791B2 (en) | 2018-11-27 | 2023-04-04 | Snap Inc. | Rendering 3D captions within real-world environments |
US12020377B2 (en) | 2018-11-27 | 2024-06-25 | Snap Inc. | Textured mesh building |
US12106441B2 (en) | 2018-11-27 | 2024-10-01 | Snap Inc. | Rendering 3D captions within real-world environments |
US20220044479A1 (en) | 2018-11-27 | 2022-02-10 | Snap Inc. | Textured mesh building |
US11823341B2 (en) | 2019-06-28 | 2023-11-21 | Snap Inc. | 3D object camera customization system |
US11443491B2 (en) | 2019-06-28 | 2022-09-13 | Snap Inc. | 3D object camera customization system |
US12211159B2 (en) | 2019-06-28 | 2025-01-28 | Snap Inc. | 3D object camera customization system |
US11636657B2 (en) | 2019-12-19 | 2023-04-25 | Snap Inc. | 3D captions with semantic graphical elements |
US11810220B2 (en) | 2019-12-19 | 2023-11-07 | Snap Inc. | 3D captions with face tracking |
US11908093B2 (en) | 2019-12-19 | 2024-02-20 | Snap Inc. | 3D captions with semantic graphical elements |
US12175613B2 (en) | 2019-12-19 | 2024-12-24 | Snap Inc. | 3D captions with face tracking |
Also Published As
Publication number | Publication date |
---|---|
TW201324248A (zh) | 2013-06-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- WO2013017991A1 (fr) | Remote control with first and second sensors | |
- CN106558026B (zh) | Deviating user interface | |
- US9134800B2 (en) | Gesture input device and gesture input method | |
- EP2817694B1 (fr) | Navigation for multidimensional input | |
- KR20110063075A (ko) | Gesture input apparatus and gesture recognition method and apparatus using the same | |
- US9602806B1 (en) | Stereo camera calibration using proximity data | |
- US9692977B2 (en) | Method and apparatus for adjusting camera top-down angle for mobile document capture | |
- US10802606B2 (en) | Method and device for aligning coordinate of controller or headset with coordinate of binocular system | |
- JP6372487B2 (ja) | Information processing apparatus, control method, program, and storage medium | |
- CN106558027B (zh) | Method for estimating deviation error in camera pose | |
- CN103635777B (zh) | Structure measuring unit for tracking, measuring and marking edges and corners of adjacent surfaces | |
- CN105320274B (zh) | Direct three-dimensional pointing using light tracking and relative position detection | |
- US10388027B2 (en) | Detection method, display apparatus, and detection system | |
- CN104270657B (zh) | Information processing method and electronic device | |
- KR101358064B1 (ko) | Remote control method and system using user image | |
- EP2678847B1 (fr) | Estimation of a control feature of a remote control with a camera | |
- CN104885433B (zh) | Method and apparatus for sensing flexing of a device | |
- KR20180106178A (ko) | Unmanned aerial vehicle, electronic device, and control method therefor | |
- JP2012194659A (ja) | Gesture recognition apparatus, gesture recognition method, and computer program | |
- US20200320729A1 (en) | Information processing apparatus, method of information processing, and information processing system | |
- AU2019315032B2 (en) | System for object tracking in physical space with aligned reference frames | |
- JP6670682B2 (ja) | Position detection method and position detection system | |
- KR101695727B1 (ko) | Position detection system and position detection method using stereo vision | |
- US20170199587A1 (en) | Method for correcting motion sensor-related errors while interacting with mobile or wearable devices |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12777946 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12777946 Country of ref document: EP Kind code of ref document: A1 |