CN106201065B - Method and system for detecting object movement output command - Google Patents
- Publication number
- CN106201065B (application CN201610489567.9A)
- Authority
- CN
- China
- Prior art keywords
- movement
- detecting
- rotation
- calculating
- time interval
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
- Image Analysis (AREA)
Abstract
The present invention relates to a method for detecting object movement and outputting a command, which includes the following steps. First, an image capturing device is used to capture a plurality of continuous images generated by the movement of an object. Next, a movement track generated by the movement of the object is calculated according to these continuous images. Then, a corresponding control command is output according to the movement track. The present invention further provides a system applying the above method.
Description
This application is a divisional of application No. 201210140609.X, entitled "Method and system for detecting object movement and outputting a command", filed on May 8, 2012.
Technical Field
The present invention relates to a method and system for outputting a command, and more particularly, to a method and system for outputting a command by detecting the movement of an object.
Background
Existing methods for selecting phone numbers or application menus on smart phones, handheld devices, or display devices usually rely on directly touching and confirming objects shown on a touch screen, or on selecting and entering items with an input device. On a desktop or tablet computer, for example, selection and confirmation are performed with a keyboard, mouse, or touch pad. Alternatively, in non-contact sensing operation, the user makes selections with up, down, left, and right gestures, and a gesture approaching the screen serves as confirmation.
Generally, in non-contact sensing operation between a user and a computer, the user often needs to step through a series of menus, and the most frequently used gestures are up, down, left, and right. In normal use, however, after a user makes a rightward (leftward) gesture and wants to make the same gesture again, the hand habitually returns to the middle position first. The detecting device then easily misjudges this return motion as a leftward (rightward) gesture, the menu jumps back to its original position, and the user cannot reliably select the desired option.
Disclosure of Invention
The present invention is directed to overcoming the disadvantages and drawbacks of the prior art by providing a method for detecting object movement and outputting a command, which can control an electronic device efficiently, accurately, and easily.
Another objective of the present invention is to provide a system for detecting object movement and outputting commands, which can control an electronic device efficiently, accurately and easily.
Other objects and advantages of the present invention will be further understood from the technical features disclosed in the present invention.
To achieve one, some, or all of the above or other objects, an embodiment of the present invention provides a method for detecting object movement and outputting a command, which includes the following steps. First, an image capturing device is used to capture a plurality of continuous images generated by the movement of an object. Next, a movement track generated by the movement of the object is calculated according to the continuous images. Then, a corresponding control command is generated for an electronic device according to the movement track, so as to operate an application program on the electronic device.
In an embodiment of the invention, the method for calculating the movement track according to the continuous images includes the following steps. First, the object's center-of-gravity position in the previous image is subtracted from its center-of-gravity position in each image to obtain a motion vector in each time interval. Then, the movement track is calculated according to the motion vector in each time interval.
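The centroid-difference step can be sketched in a few lines. The following Python is an illustrative reconstruction, not code from the patent, and assumes the per-frame center-of-gravity positions have already been extracted:

```python
import numpy as np

def motion_vectors(centroids):
    """One motion vector per time interval: each frame's object centroid
    minus the previous frame's centroid.

    centroids: (N, 2) array of (x, y) center-of-gravity positions, one per
    captured frame. Returns an (N - 1, 2) array of motion vectors.
    """
    centroids = np.asarray(centroids, dtype=float)
    return centroids[1:] - centroids[:-1]
```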
In an embodiment of the invention, the method for calculating the movement track according to the motion vector in each time interval includes the following steps. First, object rotation information is detected using the motion vector in each time interval, wherein the object rotation information includes a rotation-plane normal vector, a rotation angle, an angular velocity, a rotation radius, or a track length. Then, the object rotation information is used to operate the application program on the electronic device.

In an embodiment of the invention, the method for operating an application on the electronic device using the object rotation information includes outputting a control command corresponding to a preset angle scale according to the rotation-plane normal vector and the rotation angle.

In an embodiment of the invention, the preset angle scale may be adjusted dynamically according to the track speed or the angular velocity.

In an embodiment of the invention, when the rotation of the object stops, the output may be decelerated by a reverse acceleration/angular acceleration according to the average velocity/angular velocity detected during a period of time before the stop.
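How rotation information might be derived from those motion vectors is illustrated below. This is a minimal sketch under the assumption that frames arrive at a fixed rate `fps` (a parameter introduced here, not specified by the patent), using the signed angle between consecutive vectors as the per-interval rotation:

```python
import numpy as np

def rotation_info(vectors, fps=30.0):
    """Per-interval rotation estimated from successive motion vectors.

    vectors: (M, 2) array of motion vectors, one per time interval.
    fps: assumed fixed capture rate, turning angles into angular velocity.
    Returns (signed rotation angle per interval in radians, angular
    velocity in rad/s); positive is counterclockwise in the chosen frame.
    """
    vectors = np.asarray(vectors, dtype=float)
    v1, v2 = vectors[:-1], vectors[1:]
    cross = v1[:, 0] * v2[:, 1] - v1[:, 1] * v2[:, 0]  # 2D cross product
    dot = np.einsum('ij,ij->i', v1, v2)                # dot product
    angles = np.arctan2(cross, dot)                    # signed angle change
    return angles, angles * fps
```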
In an embodiment of the invention, the method for calculating the movement track generated by the movement of the object according to the continuous images includes detecting a shape change, size change, brightness change, or position change of the object image in the images to determine whether the movement track is a transverse movement track or a longitudinal movement track.
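One way such a transverse/longitudinal decision could look in code is sketched below; the field names and thresholds are assumptions for illustration, not values taken from the patent:

```python
def classify_movement(prev, curr, pos_thresh=5.0, area_ratio=1.15):
    """Classify a move as 'transverse' (across the image plane) or
    'longitudinal' (toward/away from the camera).

    prev / curr: dicts holding the segmented object's 'centroid' (x, y)
    and 'area' in pixels for two consecutive frames.
    """
    dx = curr['centroid'][0] - prev['centroid'][0]
    dy = curr['centroid'][1] - prev['centroid'][1]
    shift = (dx * dx + dy * dy) ** 0.5
    growth = curr['area'] / max(prev['area'], 1)

    # Position nearly unchanged while apparent size changes noticeably:
    # the object is moving along the camera axis.
    if shift < pos_thresh and (growth > area_ratio or growth < 1 / area_ratio):
        return 'longitudinal'
    return 'transverse'
```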
Another embodiment of the present invention provides a system for detecting object movement and outputting a command, which includes an image capturing device and a processing unit. The image capturing device captures a plurality of continuous images generated by the movement of an object. The processing unit receives the continuous images, calculates a movement track generated by the movement of the object according to the continuous images, and is adapted to generate a corresponding control command for an electronic device according to the movement track so as to operate an application program on the electronic device.
In an embodiment of the invention, the system further includes an invisible light source, wherein the invisible light source irradiates the object, and the image capturing device is adapted to detect light from the invisible light source.
In an embodiment of the invention, the object may be a hand of a user.
Based on the above, the method and system for outputting a command by detecting object movement of the present invention can use a hand-rotation gesture on a plane to operate a menu. For example, when the user rotates the hand clockwise for two circles, the menu moves two cells to the right; rotating counterclockwise for one circle moves it one cell to the left. The user can therefore perform continuous gesture input in space without misjudgment. In addition, the method detects object rotation mainly by calculating the difference between the motion vectors of the object's movement in successive time intervals (such as the vector rotation angle). Compared with the conventional approach of defining a rotation center point from a number of object positions after the movement, the computation and technique of this method and system are simpler and more effective.
In order to make the aforementioned and other features and advantages of the invention more comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
FIG. 1 is a diagram illustrating a system for detecting object movement and outputting a command according to an embodiment of the present invention;
FIG. 2A is a schematic diagram of 8 consecutive images of the user's hand 102 moving;
FIG. 2B is a diagram showing the center of gravity position of each object and the motion vector of the center of gravity movement of the object in FIG. 2A at different times;
FIGS. 3A-3C are schematic diagrams illustrating an operation of an application on an electronic device according to object rotation information;
FIG. 4 is a schematic view of the user's hand of FIG. 1 moving in a vertical direction;
FIG. 5A is a diagram illustrating a phonebook menu opened after the user's hand of FIG. 4 is moved in a vertical direction;
FIGS. 5B-5C are schematic diagrams illustrating operation of the phonebook menu according to the object rotation information.
Description of the symbols in the drawings
100 system
102 object
102a movement track
104 center of gravity position
110 image capturing device
120 processing unit
130 infrared light source
140 electronic device
200 telephone dialing program
210 dial wheel
212 selected area
220 telephone directory
230 reserved dialing number area
V1-V7 motion vectors
θ1-θ7 included angles
Detailed Description
The foregoing and other technical contents, features, and effects of the present invention will be apparent from the following detailed description of a preferred embodiment, read in conjunction with the accompanying drawings. Directional terms used in the following embodiments, such as up, down, left, right, front, or rear, refer only to the directions in the attached drawings. Accordingly, the directional terminology is used for purposes of illustration and is in no way limiting.
Fig. 1 is a schematic diagram of a system for detecting object movement and outputting a command according to an embodiment of the invention. Referring to fig. 1, the system 100 of the present embodiment may include an image capturing device 110 and a processing unit 120. The image capturing device 110 is adapted to capture a plurality of continuous images generated by the movement of an object 102. Specifically, the image capturing device 110 may be a Charge-Coupled Device (CCD), a Complementary Metal-Oxide-Semiconductor (CMOS) sensor, or another detector capable of detecting light intensity. Because the present embodiment uses an infrared light source 130 to illuminate the object 102, the image capturing device 110 correspondingly uses an infrared image sensor. Of course, in other embodiments, the user may select another suitable light source and a corresponding image sensor; the above is merely illustrative. In the embodiment of fig. 1, the object 102 is exemplified as a hand 102 of a user, and the following description refers to the user's hand, but the invention is not limited thereto; the sensed object may also be a prop held by the user or another body part to be sensed.
Referring to fig. 1, the processing unit 120 is adapted to receive the continuous images, captured by the image capturing device 110, that are generated by the movement of the user's hand 102, and to calculate a movement track 102a generated by the movement of the object 102 according to these continuous images. The processing unit 120 can then generate a corresponding control command for an electronic device 140 (e.g., the display device of fig. 1) according to the movement track 102a, so as to operate an application program on the electronic device 140 (e.g., the phone-dialing keypad or the phonebook-menu selection and control described in the subsequent paragraphs).
Specifically, the method for detecting the movement of the object 102 and outputting a command in the present embodiment may first use the aforementioned image capturing device 110 to capture a plurality of continuous images generated by the movement of the user's hand 102, as shown in fig. 2A, where fig. 2A superimposes eight consecutive images of the moving hand 102 for convenience of description. Then, the object's center-of-gravity position 104 in the previous image is subtracted from that in each image to obtain a motion vector for each time interval, as shown in fig. 2B, where fig. 2B illustrates the center-of-gravity positions of the object and the motion vectors V1-V7 of the center-of-gravity movement at different times in fig. 2A. The movement track 102a of the user's hand 102 is then calculated according to the motion vector in each time interval. In the present embodiment, the center-of-gravity position 104 of the object 102 can be calculated from the pixel coordinates of the captured object image, weighted by their brightness values.
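A brightness-weighted centroid of this kind can be computed directly from the pixel grid. The following sketch assumes a segmented grayscale image in which background pixels are near zero, which is plausible for the infrared setup described but not stated in the patent:

```python
import numpy as np

def weighted_centroid(image):
    """Center-of-gravity position using pixel brightness as the weight.

    image: 2D array of brightness values. Returns (x, y) in pixel
    coordinates, i.e. the intensity-weighted mean of the coordinates.
    """
    img = np.asarray(image, dtype=float)
    total = img.sum()
    if total == 0:
        raise ValueError("empty image: no bright pixels to weight")
    ys, xs = np.indices(img.shape)
    return (xs * img).sum() / total, (ys * img).sum() / total
```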
As can be seen from fig. 2B, the included angles θ1 to θ7 between the motion vectors V1-V7 and the horizontal plane differ from one time interval to the next. Since frames are usually captured at a fixed rate, the angle change per unit time can be obtained and used to calculate the movement track 102a; alternatively, the movement track 102a can be detected solely from the changes of the motion vectors. In other words, because the motion vectors can be calculated per unit time, the present embodiment can detect object rotation information from the motion vector in each time interval, where the object rotation information includes the rotation-plane normal vector, the rotation angle, the angular velocity, the rotation radius, or the track length. The application program on the electronic device 140 can thus be further operated according to the object rotation information.
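The included angles θ1-θ7 and their change per unit time could be computed as below; a minimal sketch that wraps each angle difference so a crossing of the ±180° boundary is not mistaken for a large jump:

```python
import numpy as np

def horizontal_angles(vectors):
    """Angle of each motion vector with the horizontal axis (the θ1-θ7 of
    FIG. 2B) and the wrapped angle change between successive intervals."""
    vectors = np.asarray(vectors, dtype=float)
    thetas = np.arctan2(vectors[:, 1], vectors[:, 0])
    # Wrap each difference into (-pi, pi] by going through the unit circle.
    dthetas = np.angle(np.exp(1j * np.diff(thetas)))
    return thetas, dthetas
```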
An example of operating the application on the electronic device 140 by the object rotation information will be described with reference to fig. 3A to 3C.
Referring to fig. 3A, a phone dialing program 200 on the electronic device 140 displays a dial wheel 210 and a phone book 220, and the dial wheel 210 can be operated by the object rotation information, as shown in fig. 3A. Specifically, to control the rotation speed of the dial wheel effectively, the system 100 can output control commands at a preset angle scale according to the normal vector and the angle of the rotation plane traced by the user's hand 102. For example, a clockwise rotation (normal -Z) of 180 degrees in the XY plane outputs one rightward command, while a counterclockwise rotation (normal +Z) of 360 degrees outputs two leftward commands, although the invention is not limited to this example. Thus, each time the user's hand 102 rotates counterclockwise through 180 degrees, the wheel advances one step, and the number or symbol falling into a selected area 212 is output to a reserved dialing number area 230, as shown in the sequence of fig. 3A to 3C. If the user then wants to dial the entered number, the hand 102 can be moved toward the image capturing device 110; the position of the object image stays unchanged while its shape, size, or brightness changes, and the processing unit 120 can output a control command accordingly to dial the number. In this embodiment, the user can also rotate the hand 102 clockwise for the reverse operation or control.
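Accumulating rotation against a preset angle scale might look like the sketch below. The 180-degree notch and the command names are taken from the example above for illustration rather than fixed by the patent, and the sign convention (positive = counterclockwise) is an assumption:

```python
import math

def commands_from_rotation(angle_increments, scale_deg=180.0):
    """Emit one command per 'notch' of the preset angle scale.

    angle_increments: signed per-interval rotations in radians
    (positive = counterclockwise). Returns the list of emitted commands.
    """
    acc = 0.0
    commands = []
    for inc in angle_increments:
        acc += math.degrees(inc)
        while acc >= scale_deg:          # a full counterclockwise notch
            commands.append('left')
            acc -= scale_deg
        while acc <= -scale_deg:         # a full clockwise notch
            commands.append('right')
            acc += scale_deg
    return commands
```

Keeping the leftover rotation in `acc` between frames is what lets slow, continuous gestures still trigger commands once a full notch accumulates.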
In addition, to give the user a better feel when operating the dial wheel, the preset angle scale can be adjusted dynamically according to the track speed or the angular velocity. For example, when the hand 102 rotates quickly, the scale becomes relatively small, so that a 360-degree rotation may output four commands; conversely, when the rotation is slow, the scale becomes relatively large, so that a 360-degree rotation outputs only one command. The above is merely an example and not a limitation. Similarly, when the user's hand 102 stops moving, the system 100 may keep outputting angles (or commands) with an internal inertia for a period of time after the hand track stops, decelerating with a reverse acceleration/angular acceleration according to the average velocity/angular velocity detected during a period before the stop; the duration of this period depends on the relationship between the average velocity (angular velocity) and the reverse acceleration (angular acceleration).
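The inertia behavior can be modeled as a speed that decays linearly under a reverse angular acceleration. The sketch below assumes a 30 Hz update rate and treats the deceleration magnitude as a tunable positive parameter; both values are illustrative assumptions:

```python
def coast_angles(avg_omega, decel=360.0, dt=1.0 / 30.0):
    """Angle updates emitted after the hand stops, per the inertia scheme.

    avg_omega: average angular velocity (deg/s) detected over a period
    before the stop. decel: magnitude of the reverse angular acceleration
    (deg/s^2, assumed positive). dt: update period (assumed 30 Hz).
    Returns the per-tick angle advances until the speed reaches zero.
    """
    sign = 1.0 if avg_omega >= 0 else -1.0
    speed = abs(avg_omega)
    angles = []
    while speed > 0:
        angles.append(sign * speed * dt)  # angle advanced this tick
        speed -= decel * dt               # reverse acceleration slows it
    return angles
```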
FIG. 4 is a diagram illustrating the user's hand of FIG. 1 moving vertically, and FIG. 5A is a diagram illustrating a phonebook menu opened after the user's hand of FIG. 4 moves vertically. Referring to fig. 1, fig. 4 and fig. 5A, the dial wheel 210 in the phone dialing program 200 can be operated by rotating the user's hand. If the user instead wants to select information directly from the phone book 220, the hand can be translated in the vertical direction to pull out the phonebook menu 222, yielding the arrangement of fig. 5A; that is, the phonebook menu 222 is pulled up by detecting the vertical translation of the object 102 (the user's hand 102).
Then, with the phonebook menu 222 open, the user can rotate the hand 102, and the system selects the entry to dial through the same gesture-rotation detection mechanism, as shown in fig. 5B to 5C. Similarly, when the user has selected a number to dial, the hand 102 can be moved toward the image capturing device 110; the shape, size, or brightness of the object image changes while its position stays unchanged, so the processing unit 120 can output a control command accordingly to dial the number.
In summary, the method and system for outputting commands by detecting object movement of the present invention have at least the following advantages. First, a menu can be operated by a hand-rotation gesture on a plane. For example, when the user rotates the hand clockwise for two circles, the menu moves two cells to the right; rotating counterclockwise for one circle moves it one cell to the left, so the user can perform continuous gesture input in space without misjudgment. In addition, the method of this embodiment detects object rotation mainly by calculating the difference between the motion vectors of the object's movement in successive time intervals (such as the vector rotation angle). Compared with the conventional approach of defining a rotation center point from a number of object positions after the movement, the computation and technique of this method and system are simpler and more effective.
It should be understood that the above description is only a preferred embodiment of the present invention, and the scope of the present invention should not be limited thereby, and that the invention is intended to cover all the modifications and equivalents of the claims and the contents of the specification. Furthermore, it is not necessary for any embodiment or claim of the invention to address all of the objects, advantages, or features disclosed herein. In addition, the abstract and the title of the invention are provided for assisting the search of patent documents and are not intended to limit the scope of the invention.
Claims (11)
1. A method for detecting object movement and outputting a command, comprising:
capturing a plurality of continuous images generated by the movement of an object by using an image capturing device;
calculating a movement track generated by the movement of the object according to the continuous images, wherein the method for calculating the movement track comprises detecting object rotation information and operating an application program on an electronic device by using the object rotation information; and
outputting a corresponding control command according to the movement track,
wherein the method for outputting the corresponding control command comprises outputting the control command according to a preset angle scale,
wherein when the object is detected to stop rotating, the application program on the electronic device is decelerated by a reverse acceleration/angular acceleration according to the average velocity/angular velocity detected during a period of time before the stop.
2. The method of claim 1, wherein calculating the movement track according to the plurality of continuous images comprises:
subtracting the object's center-of-gravity position in the previous image from its center-of-gravity position in each image to obtain a motion vector in each time interval; and
calculating the movement track according to the motion vector in each time interval.
3. The method of claim 2, wherein calculating the movement track according to the motion vector in each time interval comprises:
detecting the object rotation information by using the motion vector in each time interval, wherein the object rotation information comprises a rotation-plane normal vector, a rotation angle, an angular velocity, a velocity, a rotation radius, or a track length.
4. The method of claim 1, wherein before capturing the plurality of continuous images generated by the movement of the object with the image capturing device, the method further comprises:
irradiating the object with an invisible light source, wherein the image capturing device is adapted to detect light from the invisible light source.
5. The method of claim 1, wherein calculating the movement track generated by the movement of the object according to the continuous images comprises:
detecting a shape change, size change, brightness change, or position change of the object image in the images to determine whether the movement track is a transverse movement track or a longitudinal movement track.
6. The method of claim 1, wherein the object is a user's hand.
7. A system for detecting object movement and outputting a command, comprising:
an image capturing device for capturing a plurality of continuous images generated by the movement of an object; and
a processing unit for receiving the continuous images and calculating a movement track generated by the movement of the object according to the continuous images, wherein the processing unit is adapted to output a corresponding control command according to the movement track,
wherein the method for calculating the movement track comprises detecting object rotation information and operating an application program on an electronic device by using the object rotation information,
wherein the method for outputting the corresponding control command comprises outputting the control command according to a preset angle scale,
wherein when the object is detected to stop rotating, the application program on the electronic device is decelerated by a reverse acceleration/angular acceleration according to the average velocity/angular velocity detected during a period of time before the stop.
8. The system of claim 7, further comprising:
an invisible light source irradiating the object, wherein the image capturing device is adapted to detect light from the invisible light source.
9. The system of claim 7, wherein the method for the processing unit to calculate the movement track according to the plurality of continuous images comprises:
subtracting the object's center-of-gravity position in the previous image from its center-of-gravity position in each image to obtain a motion vector in each time interval; and
calculating the movement track according to the motion vector in each time interval.
10. The system of claim 9, wherein the method for calculating the movement track according to the motion vector in each time interval comprises:
detecting the object rotation information by using the motion vector in each time interval, wherein the object rotation information comprises a rotation-plane normal vector, a rotation angle, an angular velocity, a velocity, a rotation radius, or a track length.
11. The system of claim 7, wherein the object is a user's hand.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201610489567.9A CN106201065B (en) | 2012-05-08 | 2012-05-08 | Method and system for detecting object movement output command |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201210140609.XA CN103389815B (en) | 2012-05-08 | 2012-05-08 | Method and system for detecting object movement and outputting commands |
| CN201610489567.9A CN106201065B (en) | 2012-05-08 | 2012-05-08 | Method and system for detecting object movement output command |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201210140609.XA Division CN103389815B (en) | 2012-05-08 | 2012-05-08 | Method and system for detecting object movement and outputting commands |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN106201065A CN106201065A (en) | 2016-12-07 |
| CN106201065B true CN106201065B (en) | 2020-03-31 |
Family
ID=49534106
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201610489567.9A Active CN106201065B (en) | 2012-05-08 | 2012-05-08 | Method and system for detecting object movement output command |
| CN201210140609.XA Active CN103389815B (en) | 2012-05-08 | 2012-05-08 | Method and system for detecting object movement and outputting commands |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201210140609.XA Active CN103389815B (en) | 2012-05-08 | 2012-05-08 | Method and system for detecting object movement and outputting commands |
Country Status (1)
| Country | Link |
|---|---|
| CN (2) | CN106201065B (en) |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106201065B (en) * | 2012-05-08 | 2020-03-31 | 原相科技股份有限公司 | Method and system for detecting object movement output command |
| TW201743074A | 2016-06-01 | | 原相科技股份有限公司 | Measuring device and its operating method |
| CN110986917B (en) * | 2016-06-13 | 2021-10-01 | 原相科技股份有限公司 | Track sensing system and track sensing method therefor |
Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102270036A (en) * | 2010-06-04 | 2011-12-07 | 宏碁股份有限公司 | Image-based hand motion recognition system and method |
Family Cites Families (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101281422B (en) * | 2007-04-02 | 2012-02-08 | 原相科技股份有限公司 | Apparatus and method for generating three-dimensional information based on object as well as using interactive system |
| JP2009246646A (en) * | 2008-03-31 | 2009-10-22 | Kenwood Corp | Remote control apparatus and setting method |
| CN101650594A (en) * | 2008-08-14 | 2010-02-17 | 宏碁股份有限公司 | Control method based on dynamic image |
| US8477103B2 (en) * | 2008-10-26 | 2013-07-02 | Microsoft Corporation | Multi-touch object inertia simulation |
| TW201027398A (en) * | 2009-01-09 | 2010-07-16 | E Lead Electronic Co Ltd | Method of controlling cursor with multiple and variable speeds through track pad |
| JP2011054171A (en) * | 2009-09-03 | 2011-03-17 | Toshiba Corp | Image display device, image display system, and image display method of image display device |
| TWI489317B (en) * | 2009-12-10 | 2015-06-21 | Tatung Co | Method and system for operating electric apparatus |
| CN102236409A (en) * | 2010-04-30 | 2011-11-09 | 宏碁股份有限公司 | Image-based gesture recognition method and system |
| CN102411426A (en) * | 2011-10-24 | 2012-04-11 | 由田信息技术(上海)有限公司 | Operating method of electronic device |
| CN106201065B (en) * | 2012-05-08 | 2020-03-31 | 原相科技股份有限公司 | Method and system for detecting object movement output command |
2012
- 2012-05-08: CN201610489567.9A filed; granted as CN106201065B (Active)
- 2012-05-08: CN201210140609.XA filed; granted as CN103389815B (Active)
Patent Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102270036A (en) * | 2010-06-04 | 2011-12-07 | 宏碁股份有限公司 | Image-based hand motion recognition system and method |
Also Published As
| Publication number | Publication date |
|---|---|
| CN106201065A (en) | 2016-12-07 |
| CN103389815A (en) | 2013-11-13 |
| CN103389815B (en) | 2016-08-03 |
Similar Documents
| Publication | Title |
|---|---|
| JP5256109B2 | Display device |
| JP6539816B2 | Multi-modal gesture based interactive system and method using one single sensing system |
| JP5515067B2 | Operation input device, operation determination method, and program |
| US20220326784A1 | Method for outputting command by detecting object movement and system thereof |
| WO2012039140A1 | Operation input apparatus, operation input method, and program |
| US20130257736A1 | Gesture sensing apparatus, electronic system having gesture input function, and gesture determining method |
| US20160370883A1 | Information processing apparatus, control method, program, and storage medium |
| WO2007088942A1 | Input device and its method |
| CN104169966A | Depth image generation based on light attenuation |
| CN103797513A | Computer vision based two hand control of content |
| JP2012069114A | Finger-pointing, gesture based human-machine interface for vehicle |
| JP2006209563A | Interface device |
| US20130285904A1 | Computer vision based control of an icon on a display |
| CN106201065B | Method and system for detecting object movement output command |
| TWI486815B | Display device, system and method for controlling the display device |
| JP6418585B2 | Wearable device and operation method of wearable device |
| JP2011188023A | Information processing unit, method of processing information, and program |
| KR20130129775A | Method for implementing user interface based on motion detection and apparatus thereof |
| JP6008904B2 | Display control apparatus, display control method, and program |
| TWI444875B | Multi-touch input apparatus and its interface method using data fusion of a single touch sensor pad and imaging sensor |
| CN104679400B | Method and terminal for contactless information input |
| JP5118663B2 | Information terminal equipment |
| TW201303745A | Motion detection method and display device |
| JP2011186537A | Apparatus and method for processing information, and program |
| JP5550670B2 | Information processing device |
Legal Events
| Code | Title |
|---|---|
| C06 | Publication |
| PB01 | Publication |
| C10 | Entry into substantive examination |
| SE01 | Entry into force of request for substantive examination |
| GR01 | Patent grant |