
WO2013017991A1 - Remote control with first and second sensors - Google Patents


Info

Publication number
WO2013017991A1
Authority
WO
WIPO (PCT)
Prior art keywords
remote control
sensor
information
respect
sensor information
Prior art date
Application number
PCT/IB2012/053771
Other languages
French (fr)
Inventor
Petrus Augustinus Maria Van Grinsven
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Publication of WO2013017991A1 publication Critical patent/WO2013017991A1/en

Classifications

    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/038 Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
    • G08C2201/32 Remote control based on movements, attitude of remote control device

Definitions

  • The remote control 10 comprising the first sensor 11 in the form of the camera translates, for example, a light spot or a blob captured from a beacon, or a more precise device detection, into a cursor position on a display of the device 20.
  • A gesture made by a user who is holding the remote control 10 results in a sequence of light spot samples or blob samples or device detections on the camera sensor. By processing such sequences, a gesture can be detected.
  • Gesture detection based on geometrical data only, such as acceleration data, is relatively unreliable (a relatively high probability of false positives and false negatives) due to the combination of gravity and external forces.
  • An acceleration-based detection of simple horizontal movements is difficult and may cause latencies in detection.
  • Latencies are typically introduced by filtering sensor data. The filtering is applied to suppress the impact of noise from external forces.
  • Preferably, geometrical data, such as acceleration data, and optical data are combined.
  • In that case, the more accurate optical trajectory information from the camera can be fully exploited, while the acceleration data is used to derive an orientation.
  • This increases the correct detection rate of gestures, where acceleration data assists in optical detection and vice versa.
  • The combination of optical and acceleration-based detection also results in a richer set of gestures that can be recognized compared to the "optical only" and "acceleration only" situations. Improvements are also achieved in case tilt correction is implemented. With the combination of two or more different sensors 11-12, more different gestures can be detected more precisely.
  • an apparatus 1 converts control information comprising a combination of first and second sensor information from a remote control 10 into a control instruction for a device 20.
  • the remote control 10 comprises a first sensor 11 for converting image information into the first sensor information and a second sensor 12 for converting geometrical information into the second sensor information.
  • the apparatus 1 may form part of the remote control 10 or the device 20 or may be located in between. At least two different sensors 11, 12 used in combination in the remote control 10 are a great advantage.
  • the apparatus 1 may make the combination by using the second sensor information for checking, completing, correcting and/or overruling the first sensor information.
  • the first sensor 11 may comprise a camera
  • the second sensor may comprise a detector for detecting an acceleration, a motion, a movement, an angle, a tilt, an orientation and/or a rotation.
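The combined gesture detection described above can be sketched as a toy swipe detector in which the optical trajectory proposes a gesture and the acceleration data confirms its direction. Names, units and thresholds are all illustrative, not from the patent:

```python
def detect_swipe(camera_dx: list[float], accel_ax: list[float],
                 thresh: float = 1.0):
    # The optical trajectory (camera displacement per frame) proposes
    # the gesture...
    optical = sum(camera_dx)
    # ...and the acceleration data must agree on the direction, cutting
    # false positives of either sensor used alone.
    inertial = sum(accel_ax)
    if optical > thresh and inertial > 0:
        return "right"
    if optical < -thresh and inertial < 0:
        return "left"
    return None
```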

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Selective Calling Equipment (AREA)

Abstract

An apparatus (1) converts control information comprising a combination of first and second sensor information from a remote control (10) into a control instruction for a device (20). The remote control (10) comprises a first sensor (11) for converting image information into the first sensor information and a second sensor (12) for converting geometrical information into the second sensor information. The apparatus (1) may form part of the remote control (10) or the device (20) or may be located in between. At least two different sensors (11, 12) used in combination in the remote control (10) are a great advantage. The apparatus (1) may make the combination by using the second sensor information for checking, completing, correcting and/or overruling the first sensor information. The first sensor (11) may comprise a camera; the second sensor may comprise a detector for detecting an acceleration, a motion, a movement, an angle, a tilt, an orientation and/or a rotation.

Description

Remote control with first and second sensors
FIELD OF THE INVENTION
The invention relates to an apparatus for converting control information from a remote control into a control instruction for a device.
The invention further relates to a remote control comprising the apparatus, to a device comprising the apparatus, to a method for converting control information from a remote control into a control instruction for a device, to a computer program product for performing the step of the method, and to a medium for storing and comprising the computer program product.
Examples of such a device are devices with displays and other devices that are to be controlled remotely.
BACKGROUND OF THE INVENTION
WO 2004/047011 and US 2010/0157033 disclose pointing devices with cameras.
SUMMARY OF THE INVENTION
It is an object of the invention to provide an improved apparatus. Further objects of the invention are to provide a remote control comprising the improved apparatus, a device comprising the improved apparatus, an improved method, a computer program product, and a medium.
According to a first aspect, an apparatus is provided for converting control information from a remote control into a control instruction for a device, the remote control comprising a first sensor for converting image information into first sensor information and comprising a second sensor for converting geometrical information into second sensor information, the control information comprising a combination of the first sensor information and the second sensor information.
The apparatus may form part of the remote control or may form part of the device or may be located between the remote control and the device. By having provided the remote control with at least two different sensors, and by letting the apparatus convert the control information from the remote control into the control instruction for the device, which control information comprises the combination of the first and second sensor information from the different sensors, an improved apparatus has been created. Such an improved apparatus offers more possibilities owing to the fact that different sensors are used in combination.
An embodiment of the apparatus is defined by the apparatus being arranged to make the combination by using the second sensor information for checking, completing, correcting and/or overruling the first sensor information.
According to a first possibility, the control information comprises the first sensor information that has been checked by having used the second sensor information. According to a second possibility, the control information comprises the first sensor information that has been completed by having used the second sensor information.
According to a third possibility, the control information comprises the first sensor information that has been corrected by having used the second sensor information. According to a fourth possibility, the control information comprises the second sensor information that has overruled the first sensor information.
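Three of these possibilities (checking, correcting, overruling) can be sketched in a few lines. Everything here, including the one-dimensional sample format, is a hypothetical illustration rather than the claimed implementation:

```python
from dataclasses import dataclass

# Hypothetical sample format; the patent prescribes no data representation.
@dataclass
class CameraSample:
    x: float       # pointing coordinate derived from the camera image
    valid: bool    # whether a beacon/blob was detected at all

def combine(camera: CameraSample, accel_x: float, tol: float = 0.5) -> float:
    # Overruling: no camera detection, so the second sensor information
    # replaces the first entirely.
    if not camera.valid:
        return accel_x
    # Checking/correcting: a camera value inconsistent with the
    # accelerometer-derived estimate is pulled toward it.
    if abs(camera.x - accel_x) > tol:
        return 0.5 * (camera.x + accel_x)
    # Check passed: the first sensor information is used as-is.
    return camera.x
```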
An embodiment of the apparatus is defined by the first sensor comprising a camera, the image information comprising an image showing a single beacon, two or more beacons, a beacon and a non-beacon and/or the device or a part thereof, and the first sensor information comprising a detection of the single beacon, the two or more beacons, the beacon and the non-beacon and/or the device or the part thereof. By locating near the device one beacon, possibly assisted by a non-beacon such as a noise source or a reflection, or two or more beacons, a position of the remote control with respect to the device can be determined via the camera of the remote control. Alternatively, the position of the remote control with respect to the device can be determined by recognizing the device or a part thereof via the camera of the remote control.
An embodiment of the apparatus is defined by the second sensor comprising an acceleration detector, a motion detector, a movement detector, an angle detector, a tilt detector, an orientation detector and/or a rotation detector, the geometrical information comprising an acceleration in at least one direction, a motion in at least one direction, a movement in at least one direction, an angle with respect to at least one direction, a tilt with respect to at least one direction, an orientation with respect to at least one direction and/or a rotation with respect to at least one direction, and the second sensor information comprising a detection of the acceleration in the at least one direction, the motion in the at least one direction, the movement in the at least one direction, the angle with respect to the at least one direction, the tilt with respect to the at least one direction, the orientation with respect to the at least one direction and/or the rotation with respect to the at least one direction. A combination of a first sensor in the form of a camera and a second sensor in the form of one of the detectors defined above has proven to be advantageous.
An embodiment of the apparatus is defined by the control instruction comprising a pointing position on the device, a distance between the remote control and the device, a location of the remote control with respect to the device, an acceleration of the remote control in at least one direction, a motion of the remote control in at least one direction, a movement of the remote control in at least one direction, an angle of the remote control with respect to at least one direction, a tilt of the remote control with respect to at least one direction, an orientation of the remote control with respect to at least one direction and/or a rotation of the remote control with respect to at least one direction. Such an apparatus forms, together with the remote control and the device, a gesturing control system.
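One of the listed instructions, the distance between the remote control and the device, can for instance be derived from the image: with two beacons mounted a known distance apart, a pinhole-camera model relates their pixel separation to range. This is a hypothetical sketch; the patent does not specify this computation:

```python
def distance_from_beacons(px_separation: float,
                          beacon_spacing_m: float,
                          focal_px: float) -> float:
    # Similar triangles of the pinhole model: the farther away the
    # device, the closer together its two beacons appear in the image.
    return focal_px * beacon_spacing_m / px_separation

# Beacons 0.2 m apart, focal length 500 px, blobs seen 50 px apart:
print(distance_from_beacons(50.0, 0.2, 500.0))  # 2.0 m
```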
An embodiment of the apparatus is defined by the apparatus providing gesture detection. With control information comprising a combination of the first and second sensor information, an improved gesture detection has become possible.
An embodiment of the apparatus is defined by the apparatus providing tilt compensation. With control information comprising a combination of the first and second sensor information, an improved tilt compensation has become possible.
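A hypothetical sketch of such tilt compensation: a roll angle derived from the second sensor's acceleration data de-rotates the camera-derived cursor displacement, so a physically horizontal stroke stays horizontal on the display. Function names and the sign convention are illustrative, not from the patent:

```python
import math

def roll_from_gravity(ax: float, ay: float) -> float:
    # Roll of the remote about its pointing axis, estimated from the
    # gravity components measured by the accelerometer (second sensor).
    return math.atan2(ax, ay)

def tilt_compensate(dx: float, dy: float, roll: float) -> tuple[float, float]:
    # Rotate the camera-derived displacement (dx, dy) back by the roll
    # angle before mapping it to cursor movement on the device.
    c, s = math.cos(roll), math.sin(roll)
    return (c * dx + s * dy, -s * dx + c * dy)
```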
According to a second aspect, a remote control is provided comprising the apparatus as defined above.
According to a third aspect, a device is provided comprising the apparatus as defined above.
According to a fourth aspect, a method is provided for converting control information from a remote control into a control instruction for a device, the remote control comprising a first sensor for converting image information into first sensor information and comprising a second sensor for converting geometrical information into second sensor information, the method comprising a step of converting the control information comprising a combination of the first sensor information and the second sensor information into the control instruction.
An embodiment of the method is defined by the method further comprising a step of making the combination by using the second sensor information for checking, completing, correcting and/or overruling the first sensor information. According to a fifth aspect, a computer program product is provided for performing the step of the method as defined above.
According to a sixth aspect, a medium is provided for storing and comprising the computer program product as defined above.
An insight could be that a first sensor may provide first sensor information only, and a basic idea could be that a combination of first and second sensors may provide a combination of first and second sensor information.
A problem to provide an improved apparatus has been solved. A further advantage could be that more options may become possible.
These and other aspects of the invention will be apparent from and elucidated with reference to the embodiments described hereinafter.
BRIEF DESCRIPTION OF THE DRAWINGS
In the drawings:
Fig. 1 shows a trajectory of a remote control comprising a first sensor in the form of a camera,
Fig. 2 shows a trajectory of a remote control comprising a second sensor in the form of an acceleration detector,
Fig. 3 shows a remote control comprising an apparatus and shows a device, and
Fig. 4 shows a remote control and shows a device comprising an apparatus.
DETAILED DESCRIPTION OF EMBODIMENTS
In the Fig. 1, a trajectory of a remote control comprising a first sensor in the form of a camera is shown. The device comprises for example a single beacon that is to be detected by the camera, or the entire device is to be detected by the camera as a single light point or blob. Clearly, fast left-right movements made by a user are not detected as fully horizontal left-right movements by the remote control. A first reason might be that the left-right movements made by the user are not precisely horizontal. A second reason might be that the user is not making left-right movements but is rotating the remote control (e.g. rotating a handheld around an axis such as the line between the handheld and the device when the handheld is kept relatively straight in front of the device) or making circular movements with the remote control (e.g. circles in a plane relatively parallel to a front side of the device). The remote control comprising only the first sensor in the form of the camera cannot distinguish between these two reasons. Similarly, the remote control comprising only the first sensor in the form of the camera cannot distinguish between a rotation and a circular movement.
In the Fig. 2, a trajectory of a remote control comprising a second sensor in the form of an acceleration detector is shown. Fast left-right movements made by a user are not detected as fully horizontal left-right movements by the remote control at all. The faster the horizontal movements, the more the data from the acceleration detector, such as an accelerometer, will show a deviation from a horizontal movement. But also for relatively slow movements, the acceleration detector will not show a straight line.
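This noise is why accelerometer data is usually low-pass filtered before use, which in turn introduces the detection latency mentioned in the gesture-detection discussion. A minimal, illustrative filter (not from the patent):

```python
def low_pass(samples: list[float], alpha: float = 0.2) -> list[float]:
    # Exponential moving average over accelerometer samples. A smaller
    # alpha suppresses noise from external forces more strongly, but the
    # output lags the true movement more, i.e. detection latency grows.
    out, y = [], samples[0]
    for s in samples:
        y = alpha * s + (1 - alpha) * y
        out.append(y)
    return out
```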
In the Fig. 3, a remote control 10 comprising an apparatus 1 and a device 20 are shown. The remote control 10 comprises a first sensor 11 and comprises a second sensor 12. The first sensor 11 converts image information into first sensor information provided to the apparatus 1 and the second sensor 12 converts geometrical information into second sensor information provided to the apparatus 1. The apparatus 1 converts control information comprising a combination of the first sensor information and the second sensor information into a control instruction and supplies it to a transmitter 13. The transmitter 13 transmits the control instruction to a receiver 23 in the device 20. The receiver 23 supplies the control instruction to a device controller 24. Alternatively, between the apparatus 1 and one or more of the sensors 11-12, one or more controllers may be present. Further alternatively, these one or more controllers may be located in the sensors 11-12 or in the apparatus 1, or the apparatus 1 may form part of such controller(s).
In Fig. 4, a remote control 10 is shown, together with a device 20 comprising an apparatus 1. The remote control 10 comprises a first sensor 11 and a second sensor 12. The first sensor 11 converts image information into first sensor information provided to a transmitter 13, possibly via a controller 14, and the second sensor 12 converts geometrical information into second sensor information provided to the transmitter 13, possibly via the controller 14. The transmitter 13 transmits the first and second sensor information to a receiver 23 in the device 20. The receiver 23 supplies the first and second sensor information to the apparatus 1. The apparatus 1 converts control information comprising a combination of the first sensor information and the second sensor information into a control instruction and supplies it to a device controller 24. Alternatively, parts of the controller 14 may be located in the sensors 11-12. Further alternatively, the device controller 24 may be located in the apparatus 1, or the apparatus 1 may form part of the device controller 24.

Preferably, the apparatus 1 combines the first and second sensor information by using the second sensor information for checking, completing, correcting and/or overruling the first sensor information. In view of the checking, the data from Fig. 2 could be used to check the data from Fig. 1. In view of the completing, in case some of the data from Fig. 1 is missing for some positions or for some moments in time, the data from Fig. 2 could be used to complete the data from Fig. 1. In view of the correcting, in case some of the data from Fig. 1 is partly incorrect for some positions or for some moments in time, the data from Fig. 2 could be used to correct the data from Fig. 1. In view of the overruling, in case the data from Fig. 1 is fully incorrect for some positions or for some moments in time, the data from Fig. 2 could be used to overrule the data from Fig. 1.
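The checking, completing, correcting and overruling could be sketched, per position sample, roughly as follows (a minimal sketch: the (x, y) sample format and the plausibility thresholds are assumptions, not taken from the disclosure):

```python
def combine(camera_pos, accel_pos, max_dev=0.2):
    """Fuse a camera-based position sample with an acceleration-based
    estimate. camera_pos and accel_pos are (x, y) tuples, camera_pos
    may be None when a sample is missing; max_dev is an assumed
    plausibility threshold."""
    if camera_pos is None:
        # completing: camera sample missing, fall back to acceleration data
        return accel_pos
    dx = camera_pos[0] - accel_pos[0]
    dy = camera_pos[1] - accel_pos[1]
    dev = (dx * dx + dy * dy) ** 0.5
    if dev <= max_dev:
        # checking: the two sensors agree, keep the more accurate camera data
        return camera_pos
    if dev <= 2 * max_dev:
        # correcting: partly incorrect camera data is pulled toward the
        # acceleration-based estimate
        return ((camera_pos[0] + accel_pos[0]) / 2,
                (camera_pos[1] + accel_pos[1]) / 2)
    # overruling: camera data fully incorrect for this sample
    return accel_pos
```

The four branches correspond one-to-one to the checking, completing, correcting and overruling roles of the second sensor information.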
Preferably, the first sensor 11 comprises a camera, the image information comprises an image showing a single beacon, two or more beacons, a beacon and a non-beacon, and/or the device 20 or a part thereof, and the first sensor information comprises a detection of the single beacon, the two or more beacons, the beacon and the non-beacon, and/or the device 20 or the part thereof. Here it must be noted that, in a situation with two or more beacons, a combination of a beacon and a non-beacon, or a more precise detection of the device or the part thereof, the remote control 10 comprising only the first sensor 11 in the form of the camera can distinguish between pure left-right movements and left-right movements accompanied by some rotation, and between a pure rotation and a circular movement; nevertheless, the combination of the two different sensors 11 and 12 will improve that situation too.
Preferably, the second sensor 12 comprises an acceleration detector, a motion detector, a movement detector, an angle detector, a tilt detector, an orientation detector and/or a rotation detector, the geometrical information comprises an acceleration in at least one direction, a motion in at least one direction, a movement in at least one direction, an angle with respect to at least one direction, a tilt with respect to at least one direction, an orientation with respect to at least one direction and/or a rotation with respect to at least one direction, and the second sensor information comprises a detection of the acceleration in the at least one direction, the motion in the at least one direction, the movement in the at least one direction, the angle with respect to the at least one direction, the tilt with respect to the at least one direction, the orientation with respect to the at least one direction and/or the rotation with respect to the at least one direction. Here it must be noted that, in a situation where the first sensor 11 goes "out-of-reach", the second sensor 12 may compensate for this situation.

Preferably, the control instruction comprises a pointing position on the device 20, a distance between the remote control 10 and the device 20, a location of the remote control 10 with respect to the device 20, an acceleration of the remote control 10 in at least one direction, a motion of the remote control 10 in at least one direction, a movement of the remote control 10 in at least one direction, an angle of the remote control 10 with respect to at least one direction, a tilt of the remote control 10 with respect to at least one direction, an orientation of the remote control 10 with respect to at least one direction and/or a rotation of the remote control 10 with respect to at least one direction.
The remote control 10 comprising the first sensor 11 in the form of the camera translates, for example, a light spot or a blob captured from a beacon, or a more precise device detection, into, for example, a cursor position on a display of the device 20. A gesture made by a user who is holding the remote control 10 results in a sequence of light spot samples, blob samples or device detections on the camera sensor. By processing such sequences, a gesture can be detected. However, it is difficult, when using camera data only, to properly distinguish between all kinds of gestures, since the camera might not have a good reference for its orientation in space. In a camera-based remote control using a single reference point (a single beacon), there is no information about the orientation of the remote control, resulting in difficulties in detecting differences between rotations and circular movements. But also in a camera-based remote control using several reference points (several beacons) or more precise device detections, problems may occur, resulting in difficulties in detecting different gestures.
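The translation of a blob position on the camera sensor into a cursor position on the display could look roughly like this (the sensor and display resolutions, and the mirrored x axis, are illustrative assumptions):

```python
def blob_to_cursor(blob_x, blob_y, cam_w=640, cam_h=480,
                   disp_w=1920, disp_h=1080):
    """Map a detected light-spot (blob) centre on the camera sensor to
    a cursor position on the device display. Resolutions are assumed.
    The x axis is mirrored: moving the remote to the right shifts the
    beacon to the left in the camera image."""
    cx = (1.0 - blob_x / cam_w) * disp_w
    cy = (blob_y / cam_h) * disp_h
    return (cx, cy)
```

A blob at the centre of the sensor then maps to the centre of the display, and a sequence of such cursor positions forms the trajectory from which a gesture is detected.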
Gesture detection based on geometrical data only, such as for example acceleration data, is relatively unreliable (a relatively high probability of false positives and false negatives) due to the combination of gravity and external forces. An acceleration-based detection of simple horizontal movements is difficult and may cause latencies in detection. Latencies are typically introduced by filtering sensor data. The filtering is applied to suppress the impact of noise from external forces.
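A minimal illustration of such filtering (the exponential moving average and its smoothing factor are assumptions chosen for the sketch) shows where the latency comes from: the filtered output lags behind a sudden change in the input.

```python
def low_pass(samples, alpha=0.2):
    """Exponential moving average over raw accelerometer samples.
    A small alpha suppresses noise from external forces, but the
    filtered signal lags the input: this lag is the detection
    latency mentioned in the text."""
    filtered = []
    y = samples[0]
    for x in samples:
        y = alpha * x + (1.0 - alpha) * y
        filtered.append(y)
    return filtered
```

After a step from 0 to 1 the filtered value only gradually approaches 1, so a gesture detector working on the filtered data reacts correspondingly later.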
By combining two different sensors 11-12, geometrical data, such as for example acceleration data, and optical data are combined. The more accurate optical trajectory information from the camera can be fully exploited, whereby the acceleration data is used to derive an orientation. This increases the correct detection rate of gestures, where acceleration data assists optical detection and vice versa. The combination of optical and acceleration-based detection also results in a richer set of gestures that can be recognized compared to the "optical only" and "acceleration only" situations. Improvements are also achieved in case tilt-correction is implemented. With the combination of two or more different sensors 11-12, more different gestures can be detected more precisely.
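Tilt-correction could be sketched as follows (a minimal sketch; the axis conventions and the assumption that the remote is held still while estimating roll are illustrative): the roll angle is derived from the gravity components in the acceleration data, and the camera-detected displacement is rotated back into the user's frame of reference.

```python
import math

def roll_from_gravity(ax, ay):
    """Estimate the roll angle of the remote control from the gravity
    components on the accelerometer's in-plane axes (remote held still)."""
    return math.atan2(ax, ay)

def tilt_correct(x, y, roll):
    """Rotate a camera-detected displacement back by the roll angle, so
    that a gesture is interpreted in the user's frame of reference."""
    c, s = math.cos(roll), math.sin(roll)
    return (c * x + s * y, -s * x + c * y)
```

This is one way the acceleration data can "derive an orientation" for the optical trajectory: the camera supplies the accurate trajectory, the accelerometer supplies the frame in which to read it.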
Summarizing, an apparatus 1 converts control information comprising a combination of first and second sensor information from a remote control 10 into a control instruction for a device 20. The remote control 10 comprises a first sensor 11 for converting image information into the first sensor information and a second sensor 12 for converting geometrical information into the second sensor information. The apparatus 1 may form part of the remote control 10 or the device 20 or may be located in between. Using at least two different sensors 11, 12 in combination in the remote control 10 is a great advantage. The apparatus 1 may make the combination by using the second sensor information for checking, completing, correcting and/or overruling the first sensor information. The first sensor 11 may comprise a camera; the second sensor may comprise a detector for detecting an acceleration, a motion, a movement, an angle, a tilt, an orientation and/or a rotation.
While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments. Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.

CLAIMS:
1. An apparatus (1) for converting control information from a remote control (10) into a control instruction for a device (20), the remote control (10) comprising a first sensor (11) for converting image information into first sensor information and comprising a second sensor (12) for converting geometrical information into second sensor information, the control information comprising a combination of the first sensor information and the second sensor information.
2. The apparatus (1) as defined in claim 1, the apparatus (1) being arranged to make the combination by using the second sensor information for checking, completing, correcting and/or overruling the first sensor information.
3. The apparatus (1) as defined in claim 1, the first sensor (11) comprising a camera, the image information comprising an image showing a single beacon, two or more beacons, a beacon and a non-beacon and/or the device (20) or a part thereof, and the first sensor information comprising a detection of the single beacon, the two or more beacons, the beacon and the non-beacon and/or the device (20) or the part thereof.
4. The apparatus (1) as defined in claim 1, the second sensor (12) comprising an acceleration detector, a motion detector, a movement detector, an angle detector, a tilt detector, an orientation detector and/or a rotation detector, the geometrical information comprising an acceleration in at least one direction, a motion in at least one direction, a movement in at least one direction, an angle with respect to at least one direction, a tilt with respect to at least one direction, an orientation with respect to at least one direction, and/or a rotation with respect to at least one direction, and the second sensor information comprising a detection of the acceleration in the at least one direction, the motion in the at least one direction, the movement in the at least one direction, the angle with respect to the at least one direction, the tilt with respect to the at least one direction, the orientation with respect to the at least one direction and/or the rotation with respect to the at least one direction.
5. The apparatus (1) as defined in claim 1, the control instruction comprising a pointing position on the device (20), a distance between the remote control (10) and the device (20), a location of the remote control (10) with respect to the device (20), an acceleration of the remote control (10) in at least one direction, a motion of the remote control (10) in at least one direction, a movement of the remote control (10) in at least one direction, an angle of the remote control (10) with respect to at least one direction, a tilt of the remote control (10) with respect to at least one direction, an orientation of the remote control (10) with respect to at least one direction and/or a rotation of the remote control (10) with respect to at least one direction.
6. The apparatus (1) as defined in claim 1, the apparatus (1) providing gesture detection.
7. The apparatus (1) as defined in claim 1, the apparatus (1) providing tilt compensation.
8. A remote control (10) comprising the apparatus (1) as defined in claim 1.
9. A device (20) comprising the apparatus (1) as defined in claim 1.
10. A method for converting control information from a remote control (10) into a control instruction for a device (20), the remote control (10) comprising a first sensor (11) for converting image information into first sensor information and comprising a second sensor (12) for converting geometrical information into second sensor information, the method comprising a step of converting the control information comprising a combination of the first sensor information and the second sensor information into the control instruction.
11. The method as defined in claim 10, the method further comprising a step of making the combination by using the second sensor information for checking, completing, correcting and/or overruling the first sensor information.
12. A computer program product for performing the step of the method as defined in claim 10.
13. A medium for storing and comprising the computer program product as defined in claim 12.
PCT/IB2012/053771 2011-08-02 2012-07-25 Remote control with first and second sensors WO2013017991A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161514109P 2011-08-02 2011-08-02
US61/514,109 2011-08-02

Publications (1)

Publication Number Publication Date
WO2013017991A1 (en) 2013-02-07

Family

ID=47073480

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2012/053771 WO2013017991A1 (en) 2011-08-02 2012-07-25 Remote control with first and second sensors

Country Status (2)

Country Link
TW (1) TW201324248A (en)
WO (1) WO2013017991A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI601033B (en) * 2014-07-08 2017-10-01 拓連科技股份有限公司 Management methods and systems for movement detection, and related computer program products

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004047011A2 (en) 2002-11-20 2004-06-03 Koninklijke Philips Electronics N.V. User interface system based on pointing device
US20050212766A1 (en) * 2004-03-23 2005-09-29 Reinhardt Albert H M Translation controlled cursor
US20070236451A1 (en) * 2006-04-07 2007-10-11 Microsoft Corporation Camera and Acceleration Based Interface for Presentations
US20080068336A1 (en) * 2006-09-19 2008-03-20 Samsung Electronics Co., Ltd. Input device and method and medium for providing movement information of the input device
WO2010007566A1 (en) * 2008-07-18 2010-01-21 Koninklijke Philips Electronics N.V. Camera device and screen device
US20100157033A1 (en) 2005-08-11 2010-06-24 Koninklijke Philips Electronics, N.V. Method of determining the motion of a pointing device


Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10593116B2 (en) 2016-10-24 2020-03-17 Snap Inc. Augmented reality object manipulation
US10692285B2 (en) 2016-10-24 2020-06-23 Snap Inc. Redundant tracking system
US10803664B2 (en) 2016-10-24 2020-10-13 Snap Inc. Redundant tracking system
US12094063B2 (en) 2016-10-24 2024-09-17 Snap Inc. Redundant tracking system
US11481978B2 (en) 2016-10-24 2022-10-25 Snap Inc. Redundant tracking system
US11580700B2 (en) 2016-10-24 2023-02-14 Snap Inc. Augmented reality object manipulation
EP3529750A4 (en) * 2016-10-24 2019-10-16 Snap Inc. Redundant tracking system
US11704878B2 (en) 2017-01-09 2023-07-18 Snap Inc. Surface aware lens
US11195338B2 (en) 2017-01-09 2021-12-07 Snap Inc. Surface aware lens
US12217374B2 (en) 2017-01-09 2025-02-04 Snap Inc. Surface aware lens
US11715268B2 (en) 2018-08-30 2023-08-01 Snap Inc. Video clip object tracking
US11836859B2 (en) 2018-11-27 2023-12-05 Snap Inc. Textured mesh building
US11620791B2 (en) 2018-11-27 2023-04-04 Snap Inc. Rendering 3D captions within real-world environments
US12020377B2 (en) 2018-11-27 2024-06-25 Snap Inc. Textured mesh building
US12106441B2 (en) 2018-11-27 2024-10-01 Snap Inc. Rendering 3D captions within real-world environments
US20220044479A1 (en) 2018-11-27 2022-02-10 Snap Inc. Textured mesh building
US11823341B2 (en) 2019-06-28 2023-11-21 Snap Inc. 3D object camera customization system
US11443491B2 (en) 2019-06-28 2022-09-13 Snap Inc. 3D object camera customization system
US12211159B2 (en) 2019-06-28 2025-01-28 Snap Inc. 3D object camera customization system
US11636657B2 (en) 2019-12-19 2023-04-25 Snap Inc. 3D captions with semantic graphical elements
US11810220B2 (en) 2019-12-19 2023-11-07 Snap Inc. 3D captions with face tracking
US11908093B2 (en) 2019-12-19 2024-02-20 Snap Inc. 3D captions with semantic graphical elements
US12175613B2 (en) 2019-12-19 2024-12-24 Snap Inc. 3D captions with face tracking

Also Published As

Publication number Publication date
TW201324248A (en) 2013-06-16

Similar Documents

Publication Publication Date Title
WO2013017991A1 (en) Remote control with first and second sensors
CN106558026B (en) Deviating user interface
US9134800B2 (en) Gesture input device and gesture input method
EP2817694B1 (en) Navigation for multi-dimensional input
KR20110063075A (en) Gesture input device and gesture recognition method and device using same
US9602806B1 (en) Stereo camera calibration using proximity data
US9692977B2 (en) Method and apparatus for adjusting camera top-down angle for mobile document capture
US10802606B2 (en) Method and device for aligning coordinate of controller or headset with coordinate of binocular system
JP6372487B2 (en) Information processing apparatus, control method, program, and storage medium
US20170176208A1 (en) Method for providing map information and electronic device for supporing the same
CN106558027B (en) Method for estimating deviation error in camera pose
CN103635777B (en) For the structure measurement unit at the edge and corner of following the trail of, measure and mark adjacently situated surfaces
CN105320274A (en) Direct three-dimensional pointing using light tracking and relative position detection
US10388027B2 (en) Detection method, display apparatus, and detection system
CN104270657B (en) A kind of information processing method and electronic equipment
KR101358064B1 (en) Method for remote controlling using user image and system of the same
EP2678847B1 (en) Estimating control feature from remote control with camera
CN104885433B (en) For the method and apparatus of the flexure of sensing apparatus
KR20180106178A (en) Unmanned aerial vehicle, electronic device and control method thereof
JP2012194659A (en) Gesture recognition device, gesture recognition method, and computer program
US20200320729A1 (en) Information processing apparatus, method of information processing, and information processing system
AU2019315032B2 (en) System for object tracking in physical space with aligned reference frames
JP6670682B2 (en) Position detection method and position detection system
KR101695727B1 (en) Position detecting system using stereo vision and position detecting method thereof
US20170199587A1 (en) Method for correcting motion sensor-related errors while interacting with mobile or wearable devices

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12777946

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12777946

Country of ref document: EP

Kind code of ref document: A1
