
WO2012030153A2 - Contactless input device - Google Patents

Contactless input device

Info

Publication number
WO2012030153A2
WO2012030153A2 (PCT/KR2011/006428)
Authority
WO
WIPO (PCT)
Prior art keywords
signature
information
virtual
keypad
area
Prior art date
Application number
PCT/KR2011/006428
Other languages
English (en)
Korean (ko)
Other versions
WO2012030153A3 (French)
Inventor
유병문
황두성
Original Assignee
주식회사 엘앤와이비젼
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020100085878A external-priority patent/KR101036452B1/ko
Priority claimed from KR1020100111341A external-priority patent/KR101216537B1/ko
Application filed by 주식회사 엘앤와이비젼
Publication of WO2012030153A2 publication Critical patent/WO2012030153A2/fr
Publication of WO2012030153A3 publication Critical patent/WO2012030153A3/fr

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G06F3/0426Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected tracking fingers with respect to a virtual keyboard projected or printed on the surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • The present invention relates to a contactless input device. More particularly, when a user's finger touches an invisible virtual input surface or signs in a virtual input space area, the corresponding key value is input, or the signature is displayed on a physical display device, just as if the user had touched the display device directly. This prevents disease transmission and exposure of personal information such as fingerprints caused by contact, and the per-user signature characteristic information can be used for signature verification during electronic payment.
  • Input devices widely used at present, such as touch screens, keyboards, keypads, mice, push buttons, light pens, digital tablets, and trackballs, are contact types in which a user inputs data through direct physical contact with the input device.
  • Keypads related to the present invention are used in various fields, such as digital door locks, automatic teller machines (ATMs), and credit authorization terminals; both button types using physical buttons and touch-screen types are in use, and all are contact types.
  • Such contact keypads have several problems: 1) because the user enters data through physical contact with the keypad, the keypad serves as a medium for transmitting diseases (germs) through indirect contact in places with many visiting customers, such as hospitals, banks, marts, and restaurants; 2) wear caused by frequent mechanical contact risks exposing information such as passwords; and 3) personal information such as the user's fingerprints is left on the input device.
  • Korean Patent No. 10-0894544 discloses a technology that projects an optical plane including a keypad pattern in the air; when a finger is positioned at a specific position on the projected optical plane, the finger's position is detected using an IR light source unit and an IR receiver unit, and the corresponding key input is made. The device can be small and used semi-permanently without a mechanical key input device, and can be applied to door locks, digital home appliance keypads, and the like. FIG. 1 shows its schematic configuration.
  • The space touch screen device includes a pattern projection unit 10, an IR light source unit 20, an IR sensing unit 30, a signal processing unit 40, and an optical plane 50.
  • The pattern projection unit 10 generates a character or an image and displays it in the air; the optical plane 50 is the image projected in the air through the pattern projection unit 10 to receive input from a user, and may be a keypad, a still image, or the like.
  • The IR light source unit 20 irradiates an IR optical signal to detect a touch signal provided by the user on the optical plane 50; it independently modulates IR light for each channel to generate a signal corresponding to each input button displayed on the optical plane 50 and provides it to the optical plane 50, thereby keeping the optical plane 50 in an input-ready state.
  • The IR sensing unit 30 amplifies and demodulates the reflected IR optical signal generated when the user touches the keypad image provided in the air on the optical plane 50 and transmits it to the signal processor 40; the signal processor 40 computes the IR optical signal transmitted from the sensing unit 30 to find the input button touched by the user and transmits the output signal for that input to the pattern projection unit 10.
  • Because the registered patent provides a non-contact keypad, it requires no mechanical key input device, can be used semi-permanently, and can be applied to door locks or keypads of digital home appliances. However, it requires a projection device to generate the optical plane 50, so its manufacturing cost is high and it is difficult to miniaturize.
  • a user inputs a password by touching a button on the keypad or a button on the touch screen.
  • Likewise, the user inputs his or her signature on the touch screen using a touch pen or a pointed object.
  • Such touch signature pads, which use a touch pen or pointed object to enter a user's signature on the touch screen, also require the user to enter data through physical contact with an input device such as a touch pen, raising the same concerns in places such as hospitals, banks, marts, and restaurants.
  • The present invention has been made to solve the problems of the conventional contact keypad described above. By using two or more cameras to form an invisible input space area above a physical display device, contactless key selection or signature input becomes possible, preventing disease transmission and exposure of personal information such as fingerprints caused by contact.
  • Another object of the present invention is to enable identity verification during electronic payment by using the characteristic information of the signature the user makes in the virtual three-dimensional signature space area.
  • To achieve these objects, the contactless input device according to the present invention comprises: a main body having a predefined virtual input space area on one side; a physical display device installed on the other side of the main body; two or more cameras installed in the main body facing the virtual input space area, which generate captured images of the position or movement of the touch means in the virtual input space area; and an image processing unit that analyzes the captured images to calculate the position coordinates of the touch means and converts those coordinates into projection position coordinates on the display device according to preset mapping information between the virtual input space area and the display device.
  • According to the present invention, a user can input a key value or a signature without any physical contact with an actual input device, effectively preventing the transmission of diseases (germs) through indirect contact in places with many visiting customers, such as hospitals, banks, marts, and restaurants.
  • In addition, because input in the present invention is made in a non-contact manner, there is no mechanical friction on the keypad or between the touch pen and the touch screen of a signature pad, which extends the mechanical life of the device.
  • Furthermore, the characteristic information of the signature the user makes in the virtual three-dimensional signature space area can be used for identity verification during electronic payment.
  • FIG. 1 is a configuration diagram of a space touch screen device according to Korean Patent No. 10-0894544.
  • Fig. 2 is a block diagram of a contactless keypad device according to the first embodiment of the present invention.
  • FIG. 3 is a two-dimensional block diagram of the contactless keypad device according to the first embodiment of the present invention.
  • FIG. 4 is a flowchart illustrating a process of performing virtual keypad setting in the contactless keypad device according to the first embodiment.
  • FIG. 5 is a flowchart illustrating a process of performing a method of detecting a contactless key input using the contactless keypad device according to the first embodiment.
  • FIG. 6 is a block diagram illustrating a contactless signature pad according to the second embodiment.
  • FIG. 7 is a perspective view showing a schematic configuration of a contactless signature pad according to a second embodiment.
  • FIG. 8 illustrates a signature pattern on a virtual signature space area and a signature pattern projected onto a display device.
  • FIG. 10 illustrates a principle of generating mapping information using a calibration tool.
  • FIG. 11 is a flowchart illustrating a process of generating mapping information using a calibration tool.
  • FIG. 12 is a flowchart illustrating a process of receiving a signature in a contactless manner from the contactless signature pad according to the second embodiment and expressing the signature.
  • As examples of the contactless input device, a keypad will be described in the first embodiment and a signature pad in the second embodiment, but the technical concept of the present invention is not limited thereto; it applies to any non-contact input device that enables contactless input by detecting position coordinates in an input space.
  • FIG. 2 is a block diagram of the contactless keypad device according to the first embodiment of the present invention, and FIG. 3 is a two-dimensional configuration diagram of the same device.
  • As shown, the contactless keypad device includes the main body 1 and, installed on the main body 1, the physical keypad 100, the camera 200, the interface unit 300, the image processing unit 400, the storage unit 500, the output unit 600, and the control unit 700.
  • The physical keypad 100 visually shows the keypad shape to the user and may be a physical key input device such as physical buttons or an LCD screen; as long as a visual keypad is presented to the user, various forms such as paper with a keypad pattern printed on it may also be used.
  • The camera 200 is installed above the physical keypad 100 to capture images of the area above it; the two cameras are combined to generate a stereo image.
  • the interface unit 300 is for connection with a server (not shown) connected with the keypad device.
  • The image processor 400 detects the position of the touch means, such as the user's finger, from the images photographed by the camera 200, and determines the key 810 on the virtual keypad area 800 that corresponds to the detected position of the touch means.
  • The storage unit 500 stores information about the size of the physical keypad 100 and its keys 110, position information of the keys 810 in the virtual keypad area 800 (a virtual input space located in a predetermined area above the physical keypad 100), and information for converting the virtual keypad area as photographed by the camera 200 into the actual virtual keypad area 800.
  • The virtual keypad area 800 is a virtually defined, invisible area corresponding to the physical keypad 100, located at a predetermined height h above the physical keypad 100, and is used for detecting a contactless key input of the touch means.
  • The output unit 600 generates sound or light to give the feeling of a key touch when the image processor 400 determines that a key 810 on the virtual keypad area 800 has been touched; when a specific key 810 is determined to be touched, the color or brightness of the corresponding key 110 on the physical keypad 100 is changed, or sound or light is generated to give a sense of touch.
  • the control unit 700 is a part in charge of the overall control of the contactless keypad device using the camera according to the present invention.
  • In addition, a dam 900 may be installed at one side of the main body 1 to stabilize the background when stereo images are acquired.
  • FIG. 4 is a flowchart illustrating a process of performing virtual keypad setting in the contactless keypad device according to the first embodiment.
  • the physical keypad information may include the size of the physical keypad 100, location information of keys, and the like.
  • Next, a plurality of points on the physical keypad 100 are touched simultaneously or sequentially using a calibration tool having a marker formed at a point corresponding to the height of the virtual keypad area; the controller detects the marker from the stereo images captured by the camera 200 and determines and stores information about the virtual keypad area 800 (S410).
  • Then, the spatial correlation of the camera 200 with respect to the physical keypad 100 or the virtual keypad area 800 is calculated and stored using the detected marker information (S420).
  • the spatial correlation may be the relative position or relative angle of the camera 200 with respect to the virtual keypad area 800 or the physical keypad 100.
  • the conversion information is generated using the marker information obtained by touching a plurality of points of the physical keypad 100 using the calibration tool and the marker information obtained from the image information acquired from the camera 200.
  • The transform information is a matrix for converting between actual coordinate information and coordinates as seen by the camera 200; generating the transform matrix and its inverse is well known to those skilled in the art, so a detailed description is omitted.
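As a hedged illustration of such a transform (the patent leaves the matrix construction to the skilled person), one common choice is a least-squares affine fit between marker positions observed in the camera image and their known real coordinates. All coordinates and the affine model below are illustrative assumptions, not the patent's specification.

```python
# Hypothetical sketch: estimate the transform between camera-image
# coordinates and real keypad coordinates from paired marker points,
# as a least-squares affine fit. Point values are made up.
import numpy as np

def fit_affine(image_pts, real_pts):
    """Solve real = A @ [x, y, 1] for a 2x3 affine matrix A."""
    src = np.hstack([image_pts, np.ones((len(image_pts), 1))])  # N x 3
    A, *_ = np.linalg.lstsq(src, real_pts, rcond=None)          # 3 x 2
    return A.T                                                  # 2 x 3

# Four markers touched with the calibration tool (image px -> keypad mm)
image_pts = np.array([[100, 80], [500, 90], [110, 400], [510, 410]], float)
real_pts = np.array([[0, 0], [120, 0], [0, 90], [120, 90]], float)

A = fit_affine(image_pts, real_pts)
pt = A @ np.array([300.0, 240.0, 1.0])  # map an observed touch into keypad coords
```

The inverse mapping mentioned in the text would simply be the analogous fit in the opposite direction (or the pseudo-inverse of the homogeneous matrix).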
  • FIG. 5 is a flowchart illustrating a process of performing a method of detecting a contactless key input using the contactless keypad device according to the first embodiment.
  • Once the virtual keypad area 800 located in a predetermined area above the physical keypad 100 has been defined through the process of FIG. 4, the process of detecting a non-contact key input shown in FIG. 5 is performed.
  • First, images of the area above the physical keypad 100 are continuously photographed by the camera 200 to determine whether the touch means is detected.
  • When the touch means is detected and is located in the virtual keypad area 800 according to the captured images, the touch area of the touch means is extracted (S500), and the touch position on the virtual keypad area 800 is determined from the extracted area (S510).
  • Next, the nearest key 810 on the virtual keypad area 800 is found (S520), the distance between the touch position and that key 810 is computed, and if the distance is less than or equal to a reference value, the key is recognized as selected (S530); the corresponding key signal is then transmitted to the server, and in response the color or brightness of the key 110 on the physical keypad 100 is changed, or sound or light is generated to give the feeling of a touch (S540).
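The nearest-key decision of steps S520-S530 can be sketched as follows; the 3x3 key layout, key pitch, and reference distance are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch: find the nearest virtual key to a detected touch
# position and accept it only when the distance is within a reference
# value. Layout (20 mm pitch) and threshold (8 mm) are assumptions.
import math

KEYS = {str(d): ((d - 1) % 3 * 20.0, (d - 1) // 3 * 20.0) for d in range(1, 10)}
THRESHOLD = 8.0  # mm, assumed reference value

def select_key(touch_xy):
    key, (kx, ky) = min(
        KEYS.items(),
        key=lambda kv: math.hypot(kv[1][0] - touch_xy[0], kv[1][1] - touch_xy[1]))
    dist = math.hypot(kx - touch_xy[0], ky - touch_xy[1])
    return key if dist <= THRESHOLD else None  # None: no key selected

select_key((21.0, 1.5))   # near key "2"
select_key((50.0, 50.0))  # too far from any key -> None
```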
  • FIG. 6 is a block diagram of the contactless signature pad according to the second embodiment, and FIG. 7 is a perspective view showing its schematic configuration.
  • The contactless signature pad according to the second embodiment is largely composed of a non-contact signature unit 1, through which the user signs in a virtual signature space area S serving as the virtual input space and which converts the input signature into position coordinates corresponding to the display area and provides them to the display device 2, and a physical display device 2 that visually expresses the user's signature.
  • The non-contact signature unit 1 includes a main body 40 having an invisible virtual signature space area S formed on one side, two or more cameras 10 installed on one side of the main body 40, an image processor 20 that converts the signature pattern made in the virtual signature space area S into position coordinates on the display device 2, and an output unit 30 that generates sound or light to give a sense of touch when a signature is input.
  • The two or more cameras 10 are installed in the main body 40 facing the virtual signature space area S and photograph the movement of the signature input means in that area; the two cameras are combined to generate a stereo captured image.
  • The main body 40 is formed to surround the four sides of the virtual signature space area S, and the two or more cameras 10 are installed on one of those four sides, so that the cameras can acquire images stably.
  • That is, the frame of the main body 40 is formed to surround the four sides of the virtual signature space area S so that images can be acquired more stably.
  • Of course, it is also possible for the virtual signature space area S to be formed in the open space in front of the two or more cameras 10 without being surrounded by the main body 40.
  • The image processing unit 20 is installed inside the main body 40; it continuously analyzes the captured images generated by the two or more cameras 10 to calculate the position coordinates of the signature input means, and generates signature pattern data by converting those position coordinates into projection position coordinates on the display device 2 according to preset mapping information between the virtual signature space area and the display device.
  • The image processing unit 20 may further include mapping means for performing mapping between the physical display device 2 and the virtual signature space area S, extraction means for extracting the signature input means from the images acquired from the two or more cameras 10, and position determination means for determining the position of the signature input means in the virtual signature space area S using the position information of the two or more cameras 10.
  • The output unit 30 includes a lamp 31 that expresses signature initiation or detection of the signature input means as light, and a speaker 32 that outputs it as sound.
  • the display device 2 is a device for displaying the signature pattern received from the image processing unit 20, and the display unit 21 is formed at one side.
  • the display unit 21 may be an LCD, and since the signature is made in a non-contact manner, it does not need to be a touch screen such as a conventional signature pad.
  • The non-contact signature unit 1 and the display device 2 may be formed integrally, but are more preferably separable for convenience of use; in the latter case, the two must be connected by wire or wirelessly.
  • Although the image processing unit 20 and the output unit 30 are illustrated as being in the main body 40, they may instead be installed in the display device 2 or in a separate host device; in that case, the generated captured images are analyzed by the host device to map the position coordinates, which are then transmitted to the display device 2.
  • FIG. 8 illustrates a signature pattern on a virtual signature space area and a signature pattern projected onto a display device.
  • When the user signs with a finger 60 in the virtual signature space area S, the motion is captured by the two or more cameras 10; the image processing unit 20 generates a signature pattern by detecting the continuous change of position of the finger 60, and the generated signature pattern is projected onto the display unit 21 of the display device 2 and displayed.
  • It is desirable to vary the thickness of the signature pattern 70 according to the height of the finger 60 (the z value in the virtual signature space area S) during the signature operation, so that the displayed pattern resembles the user's actual signature.
  • In particular, the main body 40 can be installed separately from the display device 2. This is possible because the virtual signature space area S is defined in the open space formed in the main body 40 through the calibration process using the calibration tool 50, and the mapping information between the virtual signature space area S and the actual physical display device 2 is set in advance. Therefore, according to the present invention, the main body 40 need not be installed integrally with the display device 2 and can be installed and used at any position or angle that is convenient.
  • FIG. 9 shows an installation example in which the user can sign easily and can easily see the signature pattern displayed on the display device 2.
  • FIG. 10 illustrates the principle of generating mapping information using a calibration tool, and FIG. 11 is a flowchart illustrating the process of generating the mapping information.
  • The calibration tool 50 used in the present invention has a structure similar to a cube: markers 51 are formed at the eight vertices of the cube and are connected by connecting rods 52, and legs 53 are formed at the lower end of the calibration tool 50 so that it is spaced a predetermined distance from the bottom. The height of the legs 53 may be set equal to the distance between the display device 2 and the virtual signature space area.
  • Referring to FIG. 11, the mapping information generation process is as follows.
  • First, the calibration tool 50 of FIG. 10 is installed so as to contact the four vertices of the display device 2 (S600).
  • At this time, the cube area corresponding to the virtual signature space area is positioned, by the legs 53 of the calibration tool 50, a predetermined height above the display device 2.
  • Next, the calibration tool 50 is photographed using the two or more cameras 10 (S610), and the invisible virtual signature space area S is determined from the photographed images (S620). That is, the position information of the eight markers 51 is obtained from the captured images, and the cubic space formed by the eight markers 51 is defined as the virtual signature space area S.
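Step S620 — recovering the eight marker positions and treating the cube they span as the virtual signature space area S — can be sketched as follows. The marker coordinates and the axis-aligned-box simplification are illustrative assumptions, not the patent's exact method.

```python
# Illustrative sketch: define the virtual signature space as the box
# spanned by the eight recovered marker positions, and test whether a
# fingertip position lies inside it. All coordinates are made up (mm).
def signature_space(markers):
    """markers: eight (x, y, z) points; returns (min_corner, max_corner)."""
    mins = tuple(min(m[i] for m in markers) for i in range(3))
    maxs = tuple(max(m[i] for m in markers) for i in range(3))
    return mins, maxs

def inside(p, space):
    """True when point p lies within the signature space box."""
    (x0, y0, z0), (x1, y1, z1) = space
    return x0 <= p[0] <= x1 and y0 <= p[1] <= y1 and z0 <= p[2] <= z1

markers = [(0, 0, 10), (120, 0, 10), (0, 90, 10), (120, 90, 10),
           (0, 0, 40), (120, 0, 40), (0, 90, 40), (120, 90, 40)]
space = signature_space(markers)
```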
  • Next, the relative positions of the two or more cameras 10 with respect to the physical display device 2 or the virtual signature space area S are calculated using the marker information obtained from the images captured by the two or more cameras 10 (S630).
  • Then, transformation information is generated using the preset position information of the markers 51 of the calibration tool 50 and the marker positions extracted from the images acquired from the two or more cameras 10, and from this the mapping relationship between the virtual three-dimensional signature space area and the two-dimensional display device 2 is determined (S640).
  • FIG. 12 is a flowchart illustrating a process of receiving a signature in a contactless manner from the contactless signature pad according to the second embodiment and expressing the signature.
  • the image of the virtual signature space area S is continuously photographed by two or more cameras 10 to determine whether a signature input means such as a finger is detected (S700).
  • When the signature input means is detected, its touch area is extracted and the position of the signature input means within the extracted area is detected (S710). The position is detected in order to track the change in the coordinates of the user's finger and generate the signature pattern.
  • Next, the three-dimensional touch position is calculated using triangulation (S720). That is, a vector is created from each camera's position to the position of the signature input means in the virtual signature space area S, and the intersection of the two vectors connecting the two camera positions to the signature input means is recognized as the touch position in the virtual signature space area S.
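The triangulation of step S720 can be sketched as the midpoint of the closest approach of the two camera rays (since noisy rays rarely intersect exactly). The camera positions and viewing directions below are illustrative assumptions.

```python
# Minimal triangulation sketch: given two camera centers and the ray
# direction of the fingertip seen from each, take the midpoint of the
# closest approach of the two rays as the 3D touch position.
import numpy as np

def triangulate(c1, d1, c2, d2):
    """Midpoint of closest approach of rays c1 + t*d1 and c2 + s*d2."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    w = c1 - c2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b           # ~0 only for (near-)parallel rays
    t = (b * e - c * d) / denom     # parameter along ray 1
    s = (a * e - b * d) / denom     # parameter along ray 2
    return (c1 + t * d1 + c2 + s * d2) / 2.0

# Assumed camera centers and a fingertip position they both observe
c1, c2 = np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])
p_true = np.array([0.4, 0.3, 0.5])
p = triangulate(c1, p_true - c1, c2, p_true - c2)  # recovers p_true
```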
  • Next, the user's finger position (x, y coordinates) determined in the invisible virtual signature space area S is converted into the projection position on the display device 2 using the mapping information calculated in the setting step (S730).
  • Next, the thickness of the signature displayed on the display device 2 is determined by the height position (z coordinate) of the user's finger (S740). That is, the smaller the z coordinate, the deeper the user's finger is located in the virtual signature space area S, and the thicker the signature pattern is expressed. Since the height of the user's finger varies continuously during the signature operation, the thickness of the signature pattern varies accordingly, yielding a pattern very similar to the user's actual signature.
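The z-to-thickness rule of step S740 could be realized, for example, as a simple linear map; the depth range and pixel widths below are illustrative assumptions, not values from the patent.

```python
# Assumed linear mapping: the deeper the finger sits in the virtual
# signature space (smaller z), the thicker the projected stroke.
Z_MIN, Z_MAX = 0.0, 30.0   # mm, assumed depth range of the signature space
W_MIN, W_MAX = 1.0, 6.0    # px, assumed thinnest / thickest stroke widths

def stroke_width(z_mm):
    z = min(max(z_mm, Z_MIN), Z_MAX)        # clamp into the space
    depth = (Z_MAX - z) / (Z_MAX - Z_MIN)   # 1.0 at the deepest point
    return W_MIN + depth * (W_MAX - W_MIN)

stroke_width(0.0)   # deepest -> thickest stroke
stroke_width(30.0)  # at the top plane -> thinnest stroke
```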
  • Next, a smoothing process is performed as a post-process to prevent discontinuities caused by quantization errors generated when digitizing the continuously projected finger positions and thickness information (S750). Smoothing converts the continuously projected position coordinates and thickness information into digital information and filters a plurality of adjacent points so that the line connecting them follows a smooth form.
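One simple way to realize the smoothing post-process of S750 is a plain moving average over neighboring samples; the window size and the (x, y, width) sample format are assumptions for illustration.

```python
# Illustrative moving-average smoothing over sampled (x, y, width)
# tuples of the projected signature stroke. Window size is assumed.
def smooth(points, window=3):
    """Average each sample with its neighbors (shrinking at the ends)."""
    half = window // 2
    out = []
    for i in range(len(points)):
        lo, hi = max(0, i - half), min(len(points), i + half + 1)
        n = hi - lo
        out.append(tuple(sum(p[k] for p in points[lo:hi]) / n for k in range(3)))
    return out

raw = [(0, 0, 1.0), (1, 2, 1.2), (2, 1, 1.1), (3, 3, 1.4)]
smoothed = smooth(raw)  # jitter in y and width is damped
```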
  • the signature pattern data converted according to the mapping information is transmitted to the display device 2 and displayed on the display unit 21 (S760).
  • signature characteristic information is generated (S770), and the generated signature characteristic information is transmitted to the electronic payment system and used for a user identity inquiry (S780).
  • the signature characteristic information may include information such as signature pattern, signature height, signature speed, signature direction information, and the like, and the image processor 20 analyzes the photographed image to calculate such information and generates signature characteristic information therefrom.
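The computation of such characteristic information from the captured images could be sketched as follows; the timestamped sample format and the specific feature set (average speed, per-segment direction) are assumptions chosen to illustrate the idea, not the patent's definition.

```python
# Hedged sketch: derive simple signature characteristics (average 2D
# speed and per-segment direction angles) from timestamped fingertip
# samples. Sample values are made up.
import math

def signature_features(samples):
    """samples: list of (t_seconds, x, y, z) fingertip positions."""
    dist = 0.0
    directions = []
    for (t0, x0, y0, _), (t1, x1, y1, _) in zip(samples, samples[1:]):
        dist += math.hypot(x1 - x0, y1 - y0)
        directions.append(math.atan2(y1 - y0, x1 - x0))  # radians
    duration = samples[-1][0] - samples[0][0]
    return {"avg_speed": dist / duration, "directions": directions}

feats = signature_features([(0.0, 0, 0, 5), (0.5, 3, 4, 4), (1.0, 6, 4, 6)])
```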
  • information about the user's signature that is, signature pattern information or signature characteristic information, should be stored in advance in the electronic payment system.
  • the present invention is an invention that allows the input of the keypad input or the signature pad in a non-contact manner, it is a very useful invention that can be applied to the keypad or signature pad used in banks, card merchants and the like.
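The signature pipeline described in the steps above — mapping the detected (x, y) coordinates through the calibration data, deriving stroke thickness from the z coordinate, smoothing the digitized samples, and extracting characteristic information such as speed and direction — can be sketched in code. The specification gives no implementation, so everything here is illustrative: the function names, the simple scale-and-offset mapping, the thickness range, and the moving-average smoother are all assumptions standing in for whatever the image processor 20 actually computes.

```python
import math

def map_to_display(x, y, mapping):
    # mapping: (scale_x, scale_y, offset_x, offset_y) — an assumed linear form
    # of the "mapping information calculated in the setting step" (S730).
    sx, sy, ox, oy = mapping
    return (x * sx + ox, y * sy + oy)

def thickness_from_height(z, z_max, t_min=1.0, t_max=8.0):
    # Smaller z (finger deeper in the virtual signature space) -> thicker
    # stroke (S740). The t_min/t_max range is an illustrative choice.
    z = min(max(z, 0.0), z_max)
    return t_max - (t_max - t_min) * (z / z_max)

def smooth(points, window=3):
    # Moving-average filter over neighboring samples, a stand-in for the
    # post-processing that removes digitization discontinuities (S750).
    out = []
    n = len(points)
    for i in range(n):
        lo = max(0, i - window // 2)
        hi = min(n, i + window // 2 + 1)
        xs = [p[0] for p in points[lo:hi]]
        ys = [p[1] for p in points[lo:hi]]
        out.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return out

def signature_features(samples):
    # samples: list of (t, x, y, z) finger observations. Returns a few of
    # the characteristic values named in the text (speed, direction, height).
    speeds, headings = [], []
    for (t0, x0, y0, _), (t1, x1, y1, _) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue
        dx, dy = x1 - x0, y1 - y0
        speeds.append(math.hypot(dx, dy) / dt)   # distance / time
        headings.append(math.atan2(dy, dx))      # stroke direction (radians)
    return {
        "avg_speed": sum(speeds) / len(speeds) if speeds else 0.0,
        "avg_heading": sum(headings) / len(headings) if headings else 0.0,
        "avg_height": sum(s[3] for s in samples) / len(samples),
    }
```

In use, each camera frame would yield one (t, x, y, z) sample: the (x, y) pair is mapped to display coordinates, the z value sets the stroke width, the accumulated points are smoothed before drawing on the display unit 21, and the full sample list feeds the feature extraction sent to the payment system.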

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Position Input By Displaying (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present invention relates to a contactless input device. The contactless input device comprises a main body having a predefined virtual input space area on one side; a physical display device installed on another side of the main body; at least two cameras installed on the main body so as to face the virtual input space area and generate a captured image of the position or movement of a touch means within the virtual input space area; and an image processing unit for calculating the position coordinates of the touch means by analyzing the captured image and converting the position coordinates of the touch means into position coordinates projected on the display device. According to the present invention, a user can enter a key value or a signature without any physical contact with an input device. Consequently, disease (germs) or infection of a user through indirect contact in crowded places such as hospitals, banks, markets, and restaurants can be prevented.
PCT/KR2011/006428 2010-09-02 2011-08-31 Contactless input device WO2012030153A2 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2010-0085878 2010-09-02
KR1020100085878A KR101036452B1 (ko) 2010-09-02 2010-09-02 Contactless keypad and implementation method thereof
KR10-2010-0111341 2010-11-10
KR1020100111341A KR101216537B1 (ko) 2010-11-10 2010-11-10 Contactless signature pad

Publications (2)

Publication Number Publication Date
WO2012030153A2 true WO2012030153A2 (fr) 2012-03-08
WO2012030153A3 WO2012030153A3 (fr) 2012-05-03

Family

ID=45773389

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2011/006428 2010-09-02 2011-08-31 Contactless input device WO2012030153A2 (fr)

Country Status (1)

Country Link
WO (1) WO2012030153A2 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113908304A (zh) * 2021-09-29 2022-01-11 安徽省东超科技有限公司 Self-service terminal device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20020087938A (ko) * 2000-12-27 엔티티 도꼬모 인코퍼레이티드 Handwritten data input device and method, and personal authentication device and method
CN100489881C (zh) * 2001-01-08 2009-05-20 Vkb有限公司 Data input device and data input method
PT1573498E (pt) * 2002-11-20 2012-03-22 Koninkl Philips Electronics Nv User interface system based on a pointing device
KR100929162B1 (ko) * 2009-04-20 2009-12-01 (주)디스트릭트홀딩스 Interactive hologram information service device and method using gestures

Also Published As

Publication number Publication date
WO2012030153A3 (fr) 2012-05-03

Similar Documents

Publication Publication Date Title
WO2013009040A2 Remote manipulation device and method using virtual touch of a three-dimensionally modeled electronic device
WO2011093538A1 Iris analysis apparatus using a wide-angle camera to identify a subject, and method therefor
US20050249386A1 Pointing device having fingerprint image recognition function, fingerprint image recognition and pointing method, and method for providing portable terminal service using thereof
WO2014051231A1 Display device and control method thereof
WO2009142453A2 Method and apparatus for detecting multi-touch inputs
JPWO2009139214A1 Display device and control method
JP2001155137A Portable electronic device
WO2015126197A1 Apparatus and method for virtual-touch remote control implemented on a camera
WO2014038765A1 Content control method and digital device using the same
JP6104143B2 Device control system and device control method
WO2012154001A2 Touch recognition method in a virtual touch device that does not use a pointer
CN106446871A Touch control device and touch control method
WO2022121243A1 Calibration method and apparatus, electronic device, storage medium, and computer program product
WO2023243959A1 Method for predicting physical injury risk based on user posture recognition, and apparatus therefor
KR100968205B1 Infrared-camera-type spatial touch sensing device, method, and screen device
WO2013133624A1 Interface apparatus using motion recognition, and control method thereof
CN112115748B Certificate image recognition method, apparatus, terminal, and storage medium
WO2022247762A1 Electronic device, fingerprint unlocking method, and fingerprint unlocking apparatus therefor
WO2022145595A1 Calibration system and method
TWM364241U Optical sensing type input device
WO2012030153A2 Contactless input device
KR20090060698A Interface device using a virtual multi-touch screen and control method thereof
KR100985197B1 Class analysis system using an instructor-tracking camera and an electronic blackboard
EP3172640A1 Display device and control method thereof
WO2017026834A1 Reactive video generation method and generation program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11822128

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11822128

Country of ref document: EP

Kind code of ref document: A2

点击 这是indexloc提供的php浏览器服务,不要输入任何密码和下载