
WO2013046365A1 - Guidance device, biometric information acquisition device, and registration device

Guidance device, biometric information acquisition device, and registration device

Info

Publication number
WO2013046365A1
WO2013046365A1 PCT/JP2011/072221 JP2011072221W WO2013046365A1 WO 2013046365 A1 WO2013046365 A1 WO 2013046365A1 JP 2011072221 W JP2011072221 W JP 2011072221W WO 2013046365 A1 WO2013046365 A1 WO 2013046365A1
Authority
WO
WIPO (PCT)
Prior art keywords
finger
unit
pressure applied
detection unit
detection
Prior art date
Application number
PCT/JP2011/072221
Other languages
English (en)
Japanese (ja)
Inventor
英夫 鎌田
彰孝 皆川
東浦 康之
健太郎 鎹
克美 井出
Original Assignee
富士通フロンテック株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士通フロンテック株式会社
Priority to PCT/JP2011/072221 priority Critical patent/WO2013046365A1/fr
Priority to JP2013535716A priority patent/JP5820483B2/ja
Publication of WO2013046365A1 publication Critical patent/WO2013046365A1/fr

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/60 - Static or dynamic means for assisting the user to position a body part for biometric acquisition
    • G06V 40/63 - Static or dynamic means for assisting the user to position a body part for biometric acquisition by static guides
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/14 - Vascular patterns
    • G06V 40/145 - Sensors therefor

Definitions

  • The present invention relates to a guidance device, a biological information acquisition device, and a registration device.
  • The human body carries biological information that can identify an individual, and some of this information is used to identify and authenticate the individual.
  • Biometric information that can be used for authentication includes fingerprints, the retinas and irises of the eyes, faces, blood vessels, and DNA (Deoxyribonucleic Acid).
  • Biometric authentication is performed by comparing the biometric information collected at registration (the registration template) with the biometric information acquired at authentication.
  • Since a guide for positioning a living body serves only as a rough aid for the user, it is not always sufficient for guiding the user into an appropriate posture. For example, the user may merely touch the guide lightly or may press firmly against it.
  • The present invention has been made in view of the above points, and aims to provide a guidance device, a biological information acquisition device, and a registration device capable of guiding a living body to the normal position so that biological information can be suitably acquired.
  • To that end, a guidance device that guides a hand, as the biometric information acquisition target, to the normal position includes a first contact portion, a second contact portion, a first detection unit, a second detection unit, and a determination unit.
  • The first contact portion contacts the finger crotch portion between the second finger and the third finger.
  • The second contact portion contacts the finger crotch portion between the third finger and the fourth finger.
  • The first detection unit detects the pressure applied to the first contact portion.
  • The second detection unit detects the pressure applied to the second contact portion.
  • The determination unit determines whether or not the hand is in the normal position based on the detection result of the first detection unit and the detection result of the second detection unit.
  • Similarly, a biometric information acquisition device that guides a hand, as the biometric information acquisition target, to the normal position and acquires the biometric information includes a biometric information acquisition unit, a first contact portion, a second contact portion, a first detection unit, a second detection unit, and a control unit.
  • The biometric information acquisition unit acquires the biometric information.
  • The first contact portion contacts the finger crotch portion between the second finger and the third finger.
  • The second contact portion contacts the finger crotch portion between the third finger and the fourth finger.
  • The first detection unit detects the pressure applied to the first contact portion.
  • The second detection unit detects the pressure applied to the second contact portion.
  • The control unit acquires the biometric information when the hand is in the normal position based on the detection result of the first detection unit and the detection result of the second detection unit.
  • Likewise, a registration device that registers the biometric information includes a biometric information acquisition unit, a first contact portion, a second contact portion, a first detection unit, a second detection unit, and a control unit.
  • The biometric information acquisition unit acquires the biometric information.
  • The first contact portion contacts the finger crotch portion between the second finger and the third finger.
  • The second contact portion contacts the finger crotch portion between the third finger and the fourth finger.
  • The first detection unit detects the pressure applied to the first contact portion.
  • The second detection unit detects the pressure applied to the second contact portion.
  • The control unit acquires and registers the biometric information when the hand is in the normal position based on the detection result of the first detection unit and the detection result of the second detection unit.
  • With these devices, the living body can be guided to the normal position so that the biometric information can be suitably acquired. A minimal sketch of the underlying position determination is shown below.
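  • The following Python sketch illustrates, under stated assumptions, how such a determination unit could decide the normal position from the two finger-crotch pressures. The threshold range is borrowed from the 200 g to 400 g example given later for the first embodiment, and the function name is hypothetical; the patent does not prescribe a concrete implementation.

```python
# Hypothetical sketch of the determination unit: the hand is judged to be at the
# normal position only when both finger-crotch contact pressures fall inside an
# "appropriate" range. The thresholds are illustrative only (borrowed from the
# 200 g to 400 g example given for the first embodiment).

APPROPRIATE_RANGE_G = (200.0, 400.0)  # grams-force, example values

def is_normal_position(pressure_first_contact_g: float,
                       pressure_second_contact_g: float) -> bool:
    """Return True when both contact pressures are within the appropriate range."""
    low, high = APPROPRIATE_RANGE_G
    return (low <= pressure_first_contact_g <= high
            and low <= pressure_second_contact_g <= high)

if __name__ == "__main__":
    # Crotch between 2nd/3rd fingers at 310 g, crotch between 3rd/4th fingers at 285 g.
    print(is_normal_position(310.0, 285.0))  # True: hand judged to be at the normal position
    print(is_normal_position(120.0, 285.0))  # False: one crotch is barely touching the guide
```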
  • FIG. 1 is a diagram illustrating a configuration of a registration apparatus according to the first embodiment.
  • The registration device 10 is used in an authentication system that performs authentication using the veins of the palm.
  • The authentication system recognizes a biometric feature to identify and authenticate an individual; for example, a banking system uses it to authenticate customers.
  • The authentication system includes the registration device 10, a plurality of automatic depositing devices (authentication devices) (not shown), an authentication server, and a network.
  • The registration device 10 is provided at a bank counter or the like and performs user template registration according to the instructions or operations of an attendant.
  • The registration device 10 registers, in advance of biometric authentication, verification information (a template) in association with identification information that identifies the individual.
  • The identification information for identifying an individual is a unique ID assigned to the user directly (for example, a user number) or indirectly (for example, an account number).
  • The verification information registered in advance may be feature information obtained by extracting characteristic portions from image information with a predetermined feature extraction algorithm, the image information itself, encoded information obtained by encoding the feature information, or the like.
  • The registration device 10 includes a processing device 11, a display 12, and a sensor unit 20, and includes a keyboard 13, a mouse 14, and an IC (Integrated Circuit) card reader/writer 15 as necessary.
  • The sensor unit 20 includes a guide unit that guides the living body to the normal position and a sensing unit that acquires biometric information from the living body.
  • The sensor unit 20 guides the user's palm to the normal position, images the veins of the palm guided to the normal position, and outputs the captured image to the processing device 11.
  • The IC card reader/writer 15 reads and writes information on the user's IC card 16.
  • The keyboard 13 and the mouse 14 accept input operations.
  • Template registration (registration of the verification information) proceeds as follows.
  • A user who requests template registration inputs identification information (for example, a user ID) that identifies the user by using the keyboard 13, the mouse 14, or the IC card reader/writer 15.
  • The registration device 10 guides the user through template registration via the display 12 and requests input of biometric information for template registration.
  • The user inputs a palm vein image by holding a hand over the sensor unit 20.
  • Having received the palm vein image as biometric information, the registration device 10 creates verification information from the input.
  • The registration device 10 records the created verification information in at least one of the storage unit of the processing device 11, the storage unit of an authentication server (not shown), and the storage unit of the user's IC card 16, as illustrated by the sketch below.
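  • The following sketch illustrates this association of identification information with verification information and its recording in one or more stores. The class and variable names (VerificationRecord, TemplateStore, and the three store instances) are hypothetical; the description only requires that the template be recorded in at least one of the three storage units.

```python
# Hypothetical sketch of template registration: verification information is stored
# in association with identification information in at least one storage location.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class VerificationRecord:      # hypothetical record layout
    user_id: str               # e.g. a user number or an account number
    template: bytes            # verification information created from the palm vein image

@dataclass
class TemplateStore:
    records: Dict[str, VerificationRecord] = field(default_factory=dict)

    def register(self, record: VerificationRecord) -> None:
        self.records[record.user_id] = record

# The three candidate stores named in the description.
processing_device_store = TemplateStore()  # storage unit of the processing device 11
auth_server_store = TemplateStore()        # storage unit of the authentication server
ic_card_store = TemplateStore()            # storage unit of the user's IC card 16

record = VerificationRecord(user_id="user-0001", template=b"\x01\x02\x03")
for store in (processing_device_store, ic_card_store):  # "at least one of" the stores
    store.register(record)
```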
  • FIG. 2 is a diagram illustrating an appearance of the automatic depositing apparatus according to the first embodiment.
  • An automatic depositing device 19 that performs customer authentication in a banking system or the like is illustrated as the authentication device.
  • One or more automatic depositing devices 19 are installed in ATM (Automated Teller Machine) corners and ATM booths inside financial institutions.
  • The automatic depositing device 19 is an authentication device that performs biometric authentication when authenticating a user prior to a financial transaction.
  • The automatic depositing device 19 includes an IC card reader/writer 15 and a sensor unit 20.
  • The sensor unit 20 guides the user's palm to the normal position and captures a vein image of the palm guided to the normal position.
  • The automatic depositing device 19 authenticates the user from the verification information identified by the identification information read from the user's IC card (for example, a cash card with a built-in IC chip) by the IC card reader/writer 15 and from the user's biometric information acquired from the sensor unit 20.
  • The sensor unit 20 is a guidance device that guides the user's palm to the normal position and a biometric information acquisition device that acquires the biometric information.
  • The automatic depositing device 19 is an authentication device that includes such a biometric information acquisition device.
  • FIG. 3 is a diagram illustrating an appearance of the sensor unit according to the first embodiment.
  • FIG. 4 is a diagram illustrating an example of the arrangement of the pressure sensors of the three-finger support portion according to the first embodiment.
  • The sensor unit 20 includes a guide 21 that guides the user's palm to the normal position and a sensor 30 that acquires a vein image (biometric information) of the palm.
  • The sensor unit 20 can handle both the left hand and the right hand; here, the palm of the left hand is described as the biometric information acquisition target, and the fingers are numbered with reference to the left hand.
  • The guide 21 has a shape in which a box-shaped housing with an open upper surface is supported by a housing support portion 22 and a housing support portion 23 that serve as a base.
  • The guide 21 includes a wrist support unit 25 that supports the user's wrist, a three-finger support unit 50 that supports the three fingers of the second finger (index finger), the third finger (middle finger), and the fourth finger (ring finger), and a first finger/fifth finger support unit 24 that supports the first finger (thumb), the fifth finger (little finger), the ball of the thumb, or the ball of the little finger.
  • The wrist support unit 25, the three-finger support unit 50, and the first finger/fifth finger support unit 24 form the four upper edges of the open housing and favorably support the user's palm. The wrist support unit 25 also supports the user's wrist in a proper posture.
  • The first finger/fifth finger support unit 24 is a concave curved surface that can support the first finger or the fifth finger, including the ball of the thumb or the ball of the little finger.
  • The three-finger support unit 50 includes a three-finger placement surface 56 on which the three fingers of the second finger, the third finger, and the fourth finger are placed and supported, and two finger crotch guides 53L and 53R that guide these three fingers to their respective placement positions.
  • The three-finger placement surface 56 includes a pressure sensor 55L at a position corresponding to the finger pad of the fourth finger, a pressure sensor 55C at a position corresponding to the finger pad of the third finger, and a pressure sensor 55R at a position corresponding to the finger pad of the second finger.
  • The pressure sensor 55L detects the pressure with which the fourth finger is placed on the three-finger placement surface 56.
  • The pressure sensor 55C detects the pressure with which the third finger is placed on the three-finger placement surface 56.
  • The pressure sensor 55R detects the pressure with which the second finger is placed on the three-finger placement surface 56.
  • The pressure sensors 55 (55L, 55C, 55R) may use pressure-sensitive elements.
  • The finger crotch guide 53L contacts the finger crotch portion between the third finger and the fourth finger, guides the third finger and the fourth finger to the three-finger placement surface 56, and guides the front-rear position of the palm.
  • The finger crotch guide 53R contacts the finger crotch portion between the second finger and the third finger, guides the second finger and the third finger to the three-finger placement surface 56, and guides the front-rear position of the palm.
  • The finger crotch guide 53L includes a pressure sensor 54L at the position where the finger crotch portion between the third finger and the fourth finger abuts.
  • The pressure sensor 54L detects the pressure with which the finger crotch portion between the third finger and the fourth finger contacts the finger crotch guide 53L.
  • The finger crotch guide 53R includes a pressure sensor 54R at the position where the finger crotch portion between the second finger and the third finger abuts.
  • The pressure sensor 54R detects the pressure with which the finger crotch portion between the second finger and the third finger contacts the finger crotch guide 53R.
  • The pressure sensors 54 (54L, 54R) may use pressure-sensitive elements.
  • The finger crotch guides 53 (53L, 53R) include a finger separation rib 51 that separates adjacent fingers, and finger side guides 52 on both sides of the finger separation rib that abut the sides of each finger and guide each finger to its normal position.
  • The sensor unit 20 includes the sensor 30 on the bottom surface of the box-shaped housing.
  • The sensor 30 includes an image sensor (for example, a complementary metal oxide semiconductor (CMOS) sensor or a charge coupled device (CCD) sensor) that images the living body, a condenser lens, and a plurality of near-infrared light emitting diodes (LEDs) that irradiate the subject.
  • FIG. 5 is a diagram illustrating an appearance when a hand is placed on the sensor unit of the first embodiment.
  • The sensor unit 20 guides the palm of the user's hand 2 to the normal position with the guide 21 and acquires a vein image of the palm.
  • The user passes the finger crotch guide 53L between the third finger and the fourth finger and the finger crotch guide 53R between the second finger and the third finger, so that the hand 2 is guided to the normal position in the left-right direction (the direction along the left-right axis).
  • The user brings the finger crotch portion between the third finger and the fourth finger into contact with the finger crotch guide 53L with an appropriate pressure, and the finger crotch portion between the second finger and the third finger into contact with the finger crotch guide 53R with an appropriate pressure, so that the hand 2 is guided to the normal position in the front-rear direction (the direction along the front-rear axis). For example, when the pressure with which a finger crotch portion contacts the finger crotch guide 53 is too low, it can be estimated that the hand 2 is in an inappropriate position with respect to the finger crotch guide 53.
  • When the pressure with which a finger crotch portion contacts the finger crotch guide 53 is excessive, it can be estimated that the hand 2 is pressed too strongly against the finger crotch guide 53. In this case, when a vein image is acquired as biometric information, blood circulation in the hand 2 may be impeded (congestion due to compressed blood vessels) and the biometric information may not be acquired appropriately, or the hand 2 may be deformed and the biometric information may likewise not be acquired appropriately.
  • The user makes the pressure with which the finger crotch portion between the third finger and the fourth finger contacts the finger crotch guide 53L and the pressure with which the finger crotch portion between the second finger and the third finger contacts the finger crotch guide 53R substantially equal, so that yawing (rotation about the vertical axis) is eliminated and the hand 2 is guided to the normal position.
  • The user brings the finger pad of the fourth finger into contact with the pressure sensor 55L with an appropriate pressure, the finger pad of the third finger into contact with the pressure sensor 55C with an appropriate pressure, and the finger pad of the second finger into contact with the pressure sensor 55R with an appropriate pressure, so that pitching (rotation about the left-right axis) is eliminated and the hand 2 is guided to the normal position. For example, when the pressure with which the finger pad of each finger contacts the pressure sensor 55 is too low, it can be estimated that the fingertips of the hand 2 are pointing upward; when the pressure is excessive, it can be estimated that the fingertips are pointing downward.
  • The user makes the pressure with which the finger pad of the fourth finger contacts the pressure sensor 55L and the pressure with which the finger pad of the second finger contacts the pressure sensor 55R substantially equal, so that rolling (rotation about the front-rear axis) is eliminated and the hand 2 is guided to the normal position. A minimal sketch of how these readings could be combined into a posture estimate is shown below.
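  • The following sketch combines the five pressure readings into a qualitative posture estimate along the lines described above. The "appropriate" range and the tolerance for "substantially equal" are illustrative assumptions, as are the function and variable names.

```python
# Hypothetical posture estimation from the five pressure readings. The rules follow
# the qualitative description above; the "appropriate" range and the tolerance for
# "substantially equal" are illustrative assumptions.
from typing import List

APPROPRIATE_RANGE_G = (200.0, 400.0)  # example "appropriate" range
BALANCE_TOLERANCE_G = 50.0            # assumed tolerance for "substantially equal"

def grade(pressure_g: float) -> str:
    low, high = APPROPRIATE_RANGE_G
    if pressure_g < low:
        return "weak"
    if pressure_g > high:
        return "strong"
    return "appropriate"

def estimate_posture(crotch_l: float, crotch_r: float,
                     pad_4th: float, pad_3rd: float, pad_2nd: float) -> List[str]:
    """Return a list of detected posture deviations; an empty list means normal position."""
    issues = []
    crotch_grades = {grade(crotch_l), grade(crotch_r)}
    if crotch_grades == {"weak"}:
        issues.append("hand too far back (front-rear position)")
    elif crotch_grades == {"strong"}:
        issues.append("hand pressed too far forward (front-rear position)")
    if abs(crotch_l - crotch_r) > BALANCE_TOLERANCE_G:
        issues.append("yaw: rotation about the vertical axis")
    pad_grades = {grade(pad_4th), grade(pad_3rd), grade(pad_2nd)}
    if pad_grades == {"weak"}:
        issues.append("pitch: fingertips pointing upward")
    elif pad_grades == {"strong"}:
        issues.append("pitch: fingertips pointing downward")
    if abs(pad_4th - pad_2nd) > BALANCE_TOLERANCE_G:
        issues.append("roll: rotation about the front-rear axis")
    return issues

print(estimate_posture(300, 310, 280, 300, 290))  # [] -> normal position
print(estimate_posture(150, 160, 280, 300, 450))  # hand too far back, and rolling
```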
  • FIG. 6 is a diagram illustrating a configuration of the sensor unit according to the first embodiment.
  • The sensor unit (biometric information acquisition device) 20 includes a sensing unit 60 and a guide unit (guidance device) 70.
  • The sensing unit 60 captures a vein image of the palm and transmits the captured data to the processing device 11.
  • The sensing unit 60 includes a storage unit 61, an imaging unit 62, a control unit 63, and a communication unit 64.
  • The control unit 63 comprehensively controls each processing unit.
  • The imaging unit 62 (sensor 30) acquires image information from the living body as the subject.
  • The storage unit 61 temporarily stores the image information acquired by the imaging unit 62.
  • The communication unit 64 communicates with the processing device 11 and the guide unit 70.
  • The imaging unit 62 photographs near-infrared reflected light from the living body (palm) as the subject. Since the hemoglobin in the red blood cells flowing through the veins has released its oxygen, this hemoglobin (deoxygenated hemoglobin) absorbs near-infrared light in the vicinity of 700 nm to 1000 nm. Therefore, when near-infrared light is applied to the palm, reflection is reduced only where veins are present, and the positions of the veins can be recognized from the intensity of the reflected near-infrared light.
  • The image captured by the imaging unit 62 is achromatic, although the use of a specific light source makes it easy to extract feature information.
  • The guide unit 70 includes a communication unit 71, a control unit 72, pressure sensors 54L and 54R, pressure sensors 55L, 55C, and 55R, LEDs 57L and 57R, and LEDs 58L, 58C, and 58R.
  • The control unit 72 comprehensively controls each processing unit.
  • The communication unit 71 communicates with the sensing unit 60.
  • The pressure sensors 54L and 54R detect the pressure of the finger crotch portions that contact the finger crotch guides 53.
  • The pressure sensors 55L, 55C, and 55R detect the pressure of the finger pads of the fingers placed on the three-finger placement surface 56.
  • The LED 57L indicates, by its display mode, the magnitude of the pressure detected by the pressure sensor 54L.
  • The LED 57R indicates, by its display mode, the magnitude of the pressure detected by the pressure sensor 54R.
  • The LED 58L indicates, by its display mode, the magnitude of the pressure detected by the pressure sensor 55L.
  • The LED 58C indicates, by its display mode, the magnitude of the pressure detected by the pressure sensor 55C.
  • The LED 58R indicates, by its display mode, the magnitude of the pressure detected by the pressure sensor 55R.
  • FIG. 7 is a diagram illustrating a configuration of the registration apparatus according to the first embodiment.
  • The registration device 10 includes a control unit 80, a storage unit 81, a notification unit 82, a communication unit 83, an image input unit 84, a feature amount evaluation unit 85, a reacquisition determination unit 86, and a template registration unit 87.
  • The control unit 80 comprehensively controls each processing unit.
  • The storage unit 81 stores and holds image information acquired from the sensor unit 20, various databases, and the like.
  • The notification unit 82 generates required display messages, such as guidance for the operation of holding the palm over the sensor unit 20 and notification of success or failure of template registration, and shows them on the display 12. The notification unit 82 also generates required voice messages and outputs them from a speaker (not shown).
  • The communication unit 83 performs communication with the sensor unit 20, communication with the IC chip via the IC card reader/writer 15, and communication with other computers.
  • The image input unit 84 receives a captured image of the living body from the sensor unit 20. More specifically, when the captured image is received, the image input unit 84 removes the background from the captured image and extracts the subject. The image input unit 84 determines whether or not the extracted subject is a palm; if it determines that the subject is not a palm, it receives a captured image of the living body from the sensor unit 20 again.
  • The feature amount evaluation unit 85 evaluates the feature amount contained in the biometric information in the captured image. Specifically, the feature amount evaluation unit 85 evaluates the vein pattern in the palm image, or the amount of feature information contained in the vein pattern.
  • The feature information includes, for example, the feature points (vein end points and branch points) contained in the vein pattern, the number of veins crossing a straight line that connects a feature point to a neighboring feature point, and small images centered on the feature points.
  • The amount of feature information is a value obtained by quantifying the number and quality of pieces of feature information using a predetermined evaluation algorithm. The greater the amount of feature information, the more stable the authentication accuracy; the smaller the amount, the less stable the authentication accuracy.
  • The amount of feature information can be, for example, the number of feature points, or the total number of veins crossing the straight lines that connect feature points to neighboring feature points.
  • The feature amount evaluation by the feature amount evaluation unit 85 may be an evaluation value (for example, a numerical value such as 100 or 200) or a grade obtained by comparing the feature amount with predetermined thresholds (for example, "large", "ordinary", or "small").
  • The reacquisition determination unit 86 determines whether to reacquire the biometric information based on the evaluation of the feature amount. The reacquisition determination unit 86 decides to reacquire the biometric information, that is, to re-input a captured image, when the evaluation of the feature amount of the acquired biometric information is not sufficient, as sketched below.
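  • The sketch below quantifies the feature information and makes the reacquisition decision against a threshold. The counting scheme, the threshold value, and all names are illustrative assumptions; the description leaves the concrete evaluation algorithm open.

```python
# Hypothetical sketch of feature amount evaluation and the reacquisition decision.
# The quantification (feature points plus crossing veins) and the threshold are
# illustrative; the description leaves the concrete evaluation algorithm open.
from typing import List, NamedTuple

class FeaturePoint(NamedTuple):
    x: int
    y: int
    kind: str  # "endpoint" or "branch"

MIN_FEATURE_AMOUNT = 30  # assumed minimum for stable authentication accuracy

def feature_amount(points: List[FeaturePoint], crossing_vein_count: int) -> int:
    """Quantify the feature information, e.g. number of feature points plus crossing veins."""
    return len(points) + crossing_vein_count

def needs_reacquisition(points: List[FeaturePoint], crossing_vein_count: int) -> bool:
    """True when the evaluated feature amount is insufficient for template registration."""
    return feature_amount(points, crossing_vein_count) < MIN_FEATURE_AMOUNT

points = [FeaturePoint(10, 20, "endpoint"), FeaturePoint(32, 41, "branch")]
print(needs_reacquisition(points, crossing_vein_count=5))       # True: too little feature information
print(needs_reacquisition(points * 20, crossing_vein_count=5))  # False: enough for registration
```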
  • The template registration unit 87 processes the extracted image information into a registration template and records (registers) the registration template in the storage unit of the processing device 11, the storage unit of the authentication server, or the storage unit of the user's IC card 16.
  • FIG. 8 is a diagram illustrating a hardware configuration example of the registration apparatus according to the first embodiment.
  • The registration device 10 includes the processing device 11, the display 12, the keyboard 13, the mouse 14, the sensor unit 20, and the IC card reader/writer 15.
  • The entire processing device 11 is controlled by a CPU (Central Processing Unit) 101.
  • A RAM (Random Access Memory) 102, an HDD (Hard Disk Drive) 103, a communication interface 104, a graphic processing device 105, and an input/output interface 106 are connected to the CPU 101 via a bus 107.
  • The RAM 102 temporarily stores at least part of the OS (Operating System) program and the application programs to be executed by the CPU 101.
  • The RAM 102 also stores various data necessary for processing by the CPU 101.
  • The HDD 103 stores the OS and application programs.
  • The display 12 is connected to the graphic processing device 105.
  • The graphic processing device 105 displays images on the screen of the display 12 in accordance with commands from the CPU 101.
  • The input/output interface 106 is connected to the keyboard 13, the mouse 14, the sensor unit 20, and the IC card reader/writer 15.
  • The input/output interface 106 can also be connected to a portable recording medium interface that can write information to and read information from a portable recording medium 110.
  • The input/output interface 106 transmits signals sent from the keyboard 13, the mouse 14, the sensor unit 20, the IC card reader/writer 15, and the portable recording medium interface to the CPU 101 via the bus 107.
  • The communication interface 104 is connected to the network 9.
  • The communication interface 104 transmits and receives data to and from other computers (for example, the authentication server).
  • The processing device 11 can also be configured to include modules each composed of an FPGA (Field Programmable Gate Array), a DSP (Digital Signal Processor), or the like, or can be configured without the CPU 101.
  • In that case, the processing device 11 includes nonvolatile memory (for example, an EEPROM (Electrically Erasable and Programmable Read-Only Memory), flash memory, or a flash-memory-type memory card) that stores the firmware of the modules.
  • Firmware can be written to the nonvolatile memory via the portable recording medium 110 or the communication interface 104. The processing device 11 can thus update its firmware by rewriting the firmware stored in the nonvolatile memory.
  • FIG. 9 is a flowchart of template registration processing according to the first embodiment.
  • Step S11: The processing device 11 (image input unit 84) requests the sensor unit 20 for an image (captured data) of the palm veins used for template registration.
  • Step S12: The processing device 11 (image input unit 84) determines whether or not captured data is included in the response from the sensor unit 20.
  • The processing device 11 proceeds to step S13 when captured data is not included in the response from the sensor unit 20, and proceeds to step S14 when captured data is included.
  • Step S13: Since captured data is not included in the response from the sensor unit 20, the processing device 11 (notification unit 82) notifies that capturing of the palm vein image failed, and ends the template registration process. Alternatively, the processing device 11 (reacquisition determination unit 86) may decide to reacquire the palm vein image and, if necessary, request the sensor unit 20 again for an image (captured data) of the palm veins used for template registration.
  • Step S14: The processing device 11 extracts vein data from the acquired captured data.
  • Step S15: The processing device 11 generates feature data from the extracted vein data.
  • Step S16: The processing device 11 (template registration unit 87) generates a template from the generated feature data, registers it, and ends the template registration process. A minimal sketch of this flow is shown below.
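  • The sketch below walks through steps S11 to S16 in order. The helper functions and the dictionary-based store are stand-ins for the units described above and are assumptions, not interfaces defined by the patent.

```python
# Hypothetical sketch of the template registration flow (steps S11 to S16). The helper
# functions and the dictionary-based store stand in for the units described above.

def extract_veins(image: bytes) -> bytes:      # stand-in for vein data extraction (S14)
    return image

def generate_features(veins: bytes) -> bytes:  # stand-in for feature data generation (S15)
    return veins

def register_template(capture_response: dict, template_store: dict, user_id: str) -> bool:
    # S11/S12: the response from the sensor unit may or may not contain captured data.
    image = capture_response.get("image")
    if image is None:
        # S13: notify that capturing of the palm vein image failed and end the process
        # (reacquisition could optionally be requested here instead).
        print("Capturing of the palm vein image failed.")
        return False
    veins = extract_veins(image)          # S14
    features = generate_features(veins)   # S15
    template_store[user_id] = features    # S16: generate and register the template
    return True

store: dict = {}
register_template({"image": b"\x00\x01"}, store, "user-0001")  # registers a template
register_template({}, store, "user-0002")                      # reports a capture failure
```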
  • Next, the imaging process executed by the sensing unit 60 will be described with reference to FIG. 10.
  • FIG. 10 is a flowchart of the imaging process according to the first embodiment.
  • The sensing unit 60 receives the palm vein image capturing request from the registration device 10 and executes the imaging process.
  • Step S21: The sensing unit 60 (control unit 63) requests the guide unit 70 to determine whether or not the user's palm is in the normal position.
  • Step S22: The sensing unit 60 (control unit 63) proceeds to step S25 if there is a determination response from the guide unit 70 as to whether or not the user's palm is in the normal position, and proceeds to step S23 if there is no determination response.
  • Step S23: If no determination response has arrived from the guide unit 70 within a predetermined time (time-up), the sensing unit 60 (control unit 63) proceeds to step S24; if the predetermined time has not yet elapsed, it returns to step S22 and waits for the determination response.
  • Step S24: The sensing unit 60 (control unit 63) responds to the registration device 10 that the user's palm is not in the normal position, or that it could not be confirmed to be in the normal position, and ends the imaging process. Note that, in addition to this response, the sensing unit 60 (control unit 63) may also transmit captured palm vein image data so that a notification can be displayed on the display 12.
  • Step S25: The sensing unit 60 (control unit 63) determines whether or not the determination response from the guide unit 70 indicates that the user's palm is in the normal position. If it does, the sensing unit 60 (control unit 63) proceeds to step S26; if it does not, the sensing unit 60 (control unit 63) proceeds to step S24.
  • Step S26: The sensing unit 60 (control unit 63) captures a vein image of the palm using the sensor 30 (imaging unit 62).
  • Step S27: The sensing unit 60 (control unit 63) responds to the registration device 10 that the user's palm is in the normal position, transmits the captured vein image data of the palm, and ends the imaging process. The sketch below shows this wait-then-capture flow.
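  • The sketch below models steps S21 to S27 as a poll-with-timeout loop followed by a capture only on a positive answer. The polling callables and the timeout value are assumptions.

```python
# Hypothetical sketch of the imaging process (steps S21 to S27): ask the guide unit
# whether the palm is at the normal position, wait with a timeout, and capture only
# on a positive answer. The polling interface and the timeout value are assumptions.
import time
from typing import Callable, Optional

def imaging_process(poll_guide_unit: Callable[[], Optional[bool]],
                    capture: Callable[[], bytes],
                    timeout_s: float = 5.0) -> Optional[bytes]:
    """Return captured vein image data, or None when the normal position was not confirmed."""
    deadline = time.monotonic() + timeout_s
    # S21/S22/S23: wait for a determination response until the predetermined time is up.
    while time.monotonic() < deadline:
        verdict = poll_guide_unit()  # None: no response yet; True/False: determination response
        if verdict is None:
            time.sleep(0.05)
            continue
        if verdict:
            return capture()         # S25/S26/S27: normal position, capture and return the image
        break
    return None                      # S24: not in the normal position, or confirmation timed out

# Example with stand-in callables.
print(imaging_process(lambda: True, lambda: b"vein-image"))   # b'vein-image'
print(imaging_process(lambda: False, lambda: b"vein-image"))  # None
```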
  • FIG. 11 is a flowchart of normal position determination processing according to the first embodiment.
  • FIGS. 12 and 13 are diagrams illustrating examples of the pressure sensor output evaluation tables according to the first embodiment.
  • The guide unit 70 receives, from the sensing unit 60, the request to determine whether or not the user's palm is in the normal position, and executes the normal position determination process.
  • The guide unit 70 acquires the outputs of the pressure sensors 54L and 54R and the pressure sensors 55L, 55C, and 55R.
  • The guide unit 70 refers to the pressure sensor output evaluation table 200 and acquires a posture evaluation of the user's hand 2 from the outputs of the pressure sensors 54L and 54R.
  • For example, the guide unit 70 evaluates the pressure as "appropriate" when the pressing force detected by the pressure sensors 54L and 54R is in the range of 200 g to 400 g, and can thereby suitably guide the hand 2 to the normal position.
  • When the pressure sensors 54L and 54R are both "weak", the hand 2 is evaluated as being positioned too far back.
  • When the pressure sensors 54L and 54R are both "appropriate", the hand 2 is evaluated as being in the normal position.
  • When the pressure sensors 54L and 54R are both "strong", the hand 2 is evaluated as being pushed too far forward.
  • When the pressing force detected by the pressure sensor 54L is larger than that detected by the pressure sensor 54R, the hand 2 is evaluated as turning to the right (right rotation about the vertical axis).
  • When the pressing force detected by the pressure sensor 54R is larger than that detected by the pressure sensor 54L, the hand 2 is evaluated as turning to the left (left rotation about the vertical axis).
  • The guide unit 70 (control unit 72) refers to the pressure sensor output evaluation table 200 and acquires, from the outputs of the pressure sensors 54L and 54R, the light emission modes of the LEDs 57L and 57R corresponding to the pressing force evaluations.
  • The light emission modes of the LEDs 57L and 57R are "green" when the pressing force is evaluated as "appropriate", "blue" when it is evaluated as "weak", and "red" when it is evaluated as "strong".
  • The guide unit 70 (control unit 72) refers to the pressure sensor output evaluation table 210 and acquires a posture evaluation of the user's hand 2 from the outputs of the pressure sensors 55L, 55C, and 55R. For example, the pressing force detected by the pressure sensors 55L, 55C, and 55R is evaluated as "appropriate" when it is within a preset range, as "weak" when it is smaller than the preset range, and as "strong" when it is larger. More specifically, the guide unit 70 evaluates the pressure as "appropriate" when the pressing force detected by the pressure sensors 55L, 55C, and 55R is in the range of 200 g to 400 g, and can thereby suitably guide the hand 2 to the normal position.
  • When the pressure sensors 55L and 55R are both "weak", or when they are both "appropriate" and the pressure sensor 55C is "weak", the hand 2 is evaluated as having its fingertips pointing upward (rotation about the left-right axis).
  • When the pressure sensors 55L and 55R are both "strong", or when they are both "appropriate" and the pressure sensor 55C is "strong", the hand 2 is evaluated as having its fingertips pointing downward (rotation about the left-right axis).
  • When the pressure sensors 55L, 55C, and 55R are all "appropriate", the hand 2 is evaluated as being in the normal position.
  • The guide unit 70 (control unit 72) refers to the pressure sensor output evaluation table 210 and acquires, from the outputs of the pressure sensors 55L, 55C, and 55R, the light emission modes of the LEDs 58L, 58C, and 58R corresponding to the pressing force evaluations.
  • The light emission modes of the LEDs 58L, 58C, and 58R are "green" when the pressing force is evaluated as "appropriate", "blue" when it is evaluated as "weak", and "red" when it is evaluated as "strong".
  • Alternatively, the light emission modes of the LEDs 57L and 57R and the LEDs 58L, 58C, and 58R may be "steadily lit" when the pressing force is evaluated as "appropriate", "blinking with a long period" when it is evaluated as "weak", and "blinking with a short period" when it is evaluated as "strong".
  • The guide unit 70 (control unit 72) drives the LEDs 57L and 57R and the LEDs 58L, 58C, and 58R in the light emission modes acquired with reference to the pressure sensor output evaluation tables 200 and 210.
  • Step S34: If both the evaluation (posture evaluation) acquired with reference to the pressure sensor output evaluation table 200 and the evaluation (posture evaluation) acquired with reference to the pressure sensor output evaluation table 210 indicate the normal position, the guide unit 70 (control unit 72) proceeds to step S36; if either one does not, it proceeds to step S35.
  • Step S35: If the predetermined time from the start of the determination has not yet elapsed, the guide unit 70 (control unit 72) returns to step S31 in order to obtain a normal-position evaluation of the user's hand 2; if the predetermined time has elapsed (time-up), it proceeds to step S36.
  • Step S36: The guide unit 70 (control unit 72) responds to the determination request from the sensing unit 60 as to whether or not the user's palm is in the normal position, and ends the normal position determination process.
  • A normal position response is returned when both the posture evaluation acquired with reference to the pressure sensor output evaluation table 200 and the posture evaluation acquired with reference to the pressure sensor output evaluation table 210 indicate the normal position.
  • Otherwise, the guide unit 70 (control unit 72) responds with the evaluation obtained at time-up (not in the normal position). A minimal sketch of this determination loop with LED feedback is shown below.
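  • The sketch below ties the evaluation tables, the LED feedback, and the timeout loop together. The grading thresholds, the LED color mapping, the sensor-keyed LED interface, and the loop timing are illustrative assumptions based on the examples above.

```python
# Hypothetical sketch of the normal position determination process (steps S31 to S36),
# combining the two evaluation tables with LED feedback and a timeout. The thresholds,
# the color mapping, and the sensor-keyed LED interface are illustrative assumptions.
import time
from typing import Callable, Dict

APPROPRIATE_RANGE_G = (200.0, 400.0)
LED_COLOR = {"weak": "blue", "appropriate": "green", "strong": "red"}

def grade(pressure_g: float) -> str:
    low, high = APPROPRIATE_RANGE_G
    return "weak" if pressure_g < low else "strong" if pressure_g > high else "appropriate"

def determine_normal_position(read_sensors: Callable[[], Dict[str, float]],
                              set_led: Callable[[str, str], None],
                              timeout_s: float = 5.0) -> bool:
    """read_sensors() returns pressures keyed by '54L', '54R', '55L', '55C', '55R'."""
    deadline = time.monotonic() + timeout_s
    while True:
        pressures = read_sensors()                             # S31: acquire sensor outputs
        grades = {name: grade(p) for name, p in pressures.items()}
        for name, g in grades.items():                         # S32/S33: LED for each sensor
            set_led(name, LED_COLOR[g])
        crotch_ok = grades["54L"] == grades["54R"] == "appropriate"                 # table 200
        pads_ok = grades["55L"] == grades["55C"] == grades["55R"] == "appropriate"  # table 210
        if crotch_ok and pads_ok:
            return True                                        # S34/S36: normal position
        if time.monotonic() >= deadline:
            return False                                       # S35/S36: respond at time-up
        time.sleep(0.05)                                       # S35: keep re-evaluating

readings = {"54L": 300.0, "54R": 320.0, "55L": 280.0, "55C": 305.0, "55R": 295.0}
print(determine_normal_position(lambda: readings, lambda name, color: None))  # True
```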
  • In this way, the sensor unit 20 can guide the user's hand 2 to the normal position and acquire biometric information suitable for use in biometric authentication.
  • The threshold values set in advance for the pressing force evaluation of the pressure sensors 54L and 54R and the pressure sensors 55L, 55C, and 55R may be set for each user.
  • The preset threshold values may also be varied according to gender, age, hand size, and the like.
  • The threshold values used when the registration device 10 performs template registration may be stored and then used at the time of authentication.
  • FIG. 14 is a diagram illustrating an example of the arrangement of display areas of the three-finger support unit according to the first embodiment.
  • The sensor unit 20 causes the three-finger support unit 50 to perform a notification display in the display mode acquired in step S33 of the normal position determination process.
  • The three-finger support unit 50 has five display areas: display areas 90L, 90C, and 90R, and display areas 91L and 91R. Each display area emits light in one of three modes, "strong", "appropriate", and "weak", and thereby indicates the pressing force of the pressure sensor corresponding to that display area.
  • The display area 90L is the display area of the LED 58L corresponding to the pressure sensor 55L.
  • The display area 90L is located at the left end of the three-finger placement surface 56 and emits light near the fourth finger when the fourth finger is placed on the three-finger placement surface 56. The user can thereby easily grasp the notification about the pressing force of the finger pad of the fourth finger.
  • The display area 90C is the display area of the LED 58C corresponding to the pressure sensor 55C.
  • The display area 90C is in the center of the three-finger placement surface 56 and emits light near the third finger when the third finger is placed on the three-finger placement surface 56. The user can thereby easily grasp the notification about the pressing force of the finger pad of the third finger.
  • The display area 90R is the display area of the LED 58R corresponding to the pressure sensor 55R.
  • The display area 90R is located at the right end of the three-finger placement surface 56 and emits light near the second finger when the second finger is placed on the three-finger placement surface 56. The user can thereby easily grasp the notification about the pressing force of the finger pad of the second finger.
  • The display area 91L is the display area of the LED 57L corresponding to the pressure sensor 54L.
  • The display area 91L is in the finger crotch guide 53L and emits light between the fourth finger and the third finger when these fingers are placed on the three-finger placement surface 56. The user can thereby easily grasp the notification about the pressing force of the finger crotch portion between the fourth finger and the third finger.
  • The display area 91R is the display area of the LED 57R corresponding to the pressure sensor 54R.
  • The display area 91R is in the finger crotch guide 53R and emits light between the third finger and the second finger when these fingers are placed on the three-finger placement surface 56. The user can thereby easily grasp the notification about the pressing force of the finger crotch portion between the third finger and the second finger.
  • FIG. 15 is a diagram illustrating an example of the arrangement of the pressure sensors of the three-finger support portion according to the second embodiment.
  • The three-finger support unit 92 includes a three-finger placement surface 56 on which the three fingers of the second finger, the third finger, and the fourth finger are placed and supported, and two finger crotch guides 53L and 53R that guide these three fingers to their respective placement positions.
  • The three-finger placement surface 56 includes a pressure sensor 55C at the position corresponding to the finger pad of the third finger.
  • The guide unit 70 can detect rotation of the hand 2 about the left-right axis based on the detection result of the pressure sensor 55C.
  • When the pressure sensor 55C is "weak", the hand 2 is evaluated as having its fingertips pointing upward (rotation about the left-right axis).
  • When the pressure sensor 55C is "strong", the hand 2 is evaluated as having its fingertips pointing downward (rotation about the left-right axis).
  • When the pressure sensor 55C is "appropriate", the hand 2 is evaluated as being in the normal position.
  • Although no pressure sensors are provided at the positions corresponding to the finger pads of the second finger and the fourth finger, the hand 2 can still be guided to the normal position while its rotation is suppressed, because the first finger and the fifth finger are supported by the first finger/fifth finger support unit 24.
  • FIG. 16 is a diagram illustrating an example of the arrangement of the pressure sensors of the three-finger support portion according to the third embodiment.
  • The three-finger support unit 93 includes a three-finger placement surface 56 on which the three fingers of the second finger, the third finger, and the fourth finger are placed and supported, and two finger crotch guides 53L and 53R that guide these three fingers to their respective placement positions.
  • The three-finger placement surface 56 includes a pressure sensor 55L at the position corresponding to the finger pad of the fourth finger and a pressure sensor 55R at the position corresponding to the finger pad of the second finger.
  • The guide unit 70 can detect rotation of the hand 2 about the left-right axis based on the detection results of the pressure sensors 55L and 55R.
  • When the pressure sensors 55L and 55R are both "weak", the hand 2 is evaluated as having its fingertips pointing upward (rotation about the left-right axis).
  • When the pressure sensors 55L and 55R are both "strong", the hand 2 is evaluated as having its fingertips pointing downward (rotation about the left-right axis).
  • When the pressure sensors 55L and 55R are both "appropriate", the hand 2 is evaluated as being in the normal position.
  • The guide unit 70 can also detect rotation of the hand 2 about the front-rear axis based on the detection results of the pressure sensors 55L and 55R.
  • When the pressing force detected by the pressure sensor 55L is larger than that detected by the pressure sensor 55R, the hand 2 is evaluated as rotating counterclockwise about the front-rear axis.
  • When the pressing force detected by the pressure sensor 55R is larger than that detected by the pressure sensor 55L, the hand 2 is evaluated as rotating clockwise about the front-rear axis.
  • FIG. 17 is a diagram illustrating an example of the arrangement of sensors of the three-finger support unit according to the fourth embodiment.
  • The three-finger support unit 94 includes a three-finger placement surface 56 on which the three fingers of the second finger, the third finger, and the fourth finger are placed and supported, and two finger crotch guides 53L and 53R that guide these three fingers to their respective placement positions.
  • The three-finger placement surface 56 includes a pressure sensor 55L at the position corresponding to the finger pad of the fourth finger, a pressure sensor 55R at the position corresponding to the finger pad of the second finger, and a sensor 95 at the position corresponding to the finger pad of the third finger.
  • The sensor 95 includes an image sensor (for example, a CMOS sensor or a CCD sensor) that images the living body, a condensing lens, and a plurality of near-infrared light emitting diodes (LEDs) that irradiate the subject.
  • The sensor 95 can image a finger vein.
  • The posture determination by the guide unit 70 based on the detection results of the pressure sensors 55L and 55R can be performed as in the third embodiment.
  • The guide unit 70 can thus suitably guide the user's hand 2 to the normal position not only for acquiring the palm vein but also for acquiring a finger vein.
  • The guide unit 70 can also acquire the finger vein of the second finger or the fourth finger as biometric information.
  • The guide unit 70 is not limited to one finger and may acquire the veins of two or three fingers as biometric information.
  • Although the image sensor and the pressure sensors are configured separately here, they may be configured integrally.
  • FIG. 18 is a diagram illustrating an example of pressing detection of the sensor unit according to the fifth embodiment.
  • The three-finger support unit 96 includes two finger crotch guides 97 that guide the three fingers of the second finger, the third finger, and the fourth finger to their respective placement positions.
  • Of the two finger crotch guides 97, one contacts the finger crotch portion between the third finger and the fourth finger and the other contacts the finger crotch portion between the second finger and the third finger (one of the two is omitted from the illustration).
  • The finger crotch guide 97 has a cylindrical (boss) shape, and one end protrudes from the three-finger placement surface 56.
  • The other end of the finger crotch guide 97 is pivotally supported below the three-finger placement surface 56, and the guide can be displaced along the pressing direction of the finger crotch portion from a forward-tilted state (finger crotch guide 97a), through an upright state (finger crotch guide 97b), to a backward-tilted state (finger crotch guide 97c).
  • The finger crotch guide 97 is urged toward the forward-tilted state (finger crotch guide 97a) by an urging portion 962.
  • The finger crotch guide 97 can be displaced by being pushed by the finger crotch portion against the urging force of the urging portion 962.
  • The guide unit 70 can detect the displacement state of the finger crotch guide 97 with a position detection sensor (for example, a photosensor) 961.
  • The finger crotch guide 97 is in the forward-tilted state when the pressing force is too low, in the upright state when the pressing force is in the appropriate range, and in the backward-tilted state when the pressing force is excessive. More specifically, the finger crotch guide 97 is in the upright state when the pressing force is in the range of 200 g to 400 g. The user can thereby easily grasp the appropriate range of the pressing force, as in the sketch below.
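  • The sketch below maps the displacement state detected by the position detection sensor 961 to a pressing-force judgement. The enum, the state names, and the feedback strings are assumptions for illustration.

```python
# Hypothetical sketch of the fifth embodiment: the displacement state of the pivoting
# finger crotch guide 97, detected by the position detection sensor 961, is mapped to
# a pressing-force judgement. The enum and the feedback strings are assumptions.
from enum import Enum

class GuideState(Enum):
    FORWARD_TILT = "forward_tilt"    # pressing force too low (e.g. below about 200 g)
    UPRIGHT = "upright"              # pressing force appropriate (e.g. 200 g to 400 g)
    BACKWARD_TILT = "backward_tilt"  # pressing force excessive (e.g. above about 400 g)

def pressing_force_judgement(state: GuideState) -> str:
    return {
        GuideState.FORWARD_TILT: "press a little more firmly against the guide",
        GuideState.UPRIGHT: "pressing force is appropriate",
        GuideState.BACKWARD_TILT: "pressing too hard against the guide",
    }[state]

print(pressing_force_judgement(GuideState.UPRIGHT))  # pressing force is appropriate
```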
  • FIG. 19 is a diagram illustrating an example of a pressing force display of the sensor unit according to the sixth embodiment.
  • In the sixth embodiment, the magnitude of the pressing force detected by each pressure sensor is indicated by an indicator display.
  • A display unit 98 is provided at a position corresponding to each pressure sensor (the pressure sensors 55L, 55C, and 55R and the pressure sensors 54L and 54R).
  • The display unit 98 is a band of predetermined length; its central part corresponds to the appropriate pressing force range, its right side to an excessive pressing force, and its left side to an insufficient pressing force.
  • The display unit 98 displays the pressing force 99 at the position corresponding to the detected pressing force. The user can thereby easily grasp the appropriate range of the pressing force; a minimal sketch of this mapping follows.
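  • The sketch below renders the detected pressing force as a marker position along a band, with the appropriate range near the middle. The band width, the mapping function, and the thresholds are illustrative assumptions.

```python
# Hypothetical sketch of the sixth embodiment: the detected pressing force is rendered
# as a marker position along the band-shaped display unit 98 (left = too weak,
# center = appropriate, right = too strong). Band width and mapping are assumptions.

def indicator_position(pressure_g: float,
                       low_g: float = 200.0, high_g: float = 400.0,
                       band_width: int = 21) -> str:
    """Render the pressing force 99 as a marker on a text band of band_width cells."""
    span = 2.0 * high_g  # map 0 g .. 2 * high_g roughly onto the band
    cell = min(band_width - 1, max(0, int(pressure_g / span * (band_width - 1))))
    band = ["-"] * band_width
    band[cell] = "|"
    label = "weak" if pressure_g < low_g else "strong" if pressure_g > high_g else "appropriate"
    return "".join(band) + "  (" + label + ")"

print(indicator_position(120.0))  # marker toward the left of the band (too weak)
print(indicator_position(300.0))  # marker near the middle (appropriate)
print(indicator_position(520.0))  # marker toward the right (too strong)
```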
  • As described above, the sensor unit 20 can reduce the variation and wobble of the hand-holding operation at each biometric information acquisition opportunity and can make the posture uniform, so that authentication accuracy is stabilized. Moreover, since the sensor unit 20 guides the user to the normal position in a concrete way, the user can easily grasp the normal position.
  • The registration device 10 provided with the sensor unit 20 and an authentication device provided with the sensor unit 20 (for example, the automatic depositing device 19) therefore offer stable authentication accuracy and improved convenience. If the authentication device is an automatic depositing device 19 or another device that provides services to many users, such as authentication when entering or leaving a room, the users' waiting time can be reduced and the service can be improved.
  • The above processing functions can be realized by a computer.
  • In that case, a program describing the processing contents of the functions that each device should have is provided.
  • The program describing the processing contents can be recorded on a computer-readable recording medium (including a portable recording medium).
  • Examples of the computer-readable recording medium include a magnetic recording device, an optical disc, a magneto-optical recording medium, and a semiconductor memory.
  • Examples of the magnetic recording device include a hard disk drive (HDD), a flexible disk (FD), and magnetic tape.
  • Optical discs include DVD (Digital Versatile Disc), DVD-RAM, CD-ROM, and CD-R (Recordable)/RW (ReWritable).
  • Magneto-optical recording media include MO (Magneto-Optical) disks.
  • When the program is distributed, portable recording media such as DVDs or CD-ROMs on which the program is recorded are sold, for example. It is also possible to store the program in a storage device of a server computer and transfer the program from the server computer to another computer via a network.
  • The computer that executes the program stores, for example, the program recorded on the portable recording medium or the program transferred from the server computer in its own storage device. The computer then reads the program from its own storage device and executes processing according to the program. The computer can also read the program directly from the portable recording medium and execute processing according to it, or it can execute processing sequentially according to received portions of the program each time the program is transferred from the server computer.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Vascular Medicine (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Image Input (AREA)

Abstract

The objective of the invention is to guide a living body to a normal position so that biometric information can be suitably acquired. According to the invention, a sensor unit is provided with a guide for guiding a user's palm to a normal position and a sensor for acquiring a vein image (biometric information) of the palm. The guide is provided with a three-finger support unit (50) for supporting three fingers, namely a second finger (index finger), a third finger (middle finger), and a fourth finger (ring finger). The three-finger support unit (50) is provided with: a three-finger placement surface (56) on which the three fingers, namely the second finger, the third finger, and the fourth finger, are placed and supported; and two finger crotch guides (53L and 53R) for guiding these three fingers to their respective placement positions. The three-finger placement surface (56) is provided with pressure sensors (55L, 55C, 55R) at positions corresponding to the pad of each respective finger. The finger crotch guides (53L and 53R) are provided with pressure sensors (54L and 54R) at the positions abutting the finger crotch portions. The sensor unit notifies the user of the detected pressures in order to guide the user's palm to the normal position.
PCT/JP2011/072221 2011-09-28 2011-09-28 Dispositif de guidage, dispositif d'acquisition d'informations biométriques et dispositif d'enregistrement WO2013046365A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2011/072221 WO2013046365A1 (fr) 2011-09-28 2011-09-28 Dispositif de guidage, dispositif d'acquisition d'informations biométriques et dispositif d'enregistrement
JP2013535716A JP5820483B2 (ja) 2011-09-28 2011-09-28 案内装置、生体情報取得装置、および登録装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2011/072221 WO2013046365A1 (fr) 2011-09-28 2011-09-28 Dispositif de guidage, dispositif d'acquisition d'informations biométriques et dispositif d'enregistrement

Publications (1)

Publication Number Publication Date
WO2013046365A1 (fr) 2013-04-04

Family

ID=47994476

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/072221 WO2013046365A1 (fr) 2011-09-28 2011-09-28 Dispositif de guidage, dispositif d'acquisition d'informations biométriques et dispositif d'enregistrement

Country Status (2)

Country Link
JP (1) JP5820483B2 (fr)
WO (1) WO2013046365A1 (fr)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005253988A (ja) * 2005-03-30 2005-09-22 Hitachi Ltd 預金預払機及びその処理方法
JP2007301058A (ja) * 2006-05-10 2007-11-22 Hitachi Omron Terminal Solutions Corp 個人認証装置
JP2008136729A (ja) * 2006-12-04 2008-06-19 Oki Electric Ind Co Ltd 生体情報取得センサ並びに該センサを有する生体情報取得装置及び自動取引装置
JP2008246011A (ja) * 2007-03-30 2008-10-16 Oki Electric Ind Co Ltd 静脈認証装置
JP2009122729A (ja) * 2007-11-12 2009-06-04 Fujitsu Ltd ガイド装置、撮像装置、撮像システム、ガイド方法
JP2011065659A (ja) * 2010-10-21 2011-03-31 Hitachi Ltd 個人認証装置及び方法

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015141007A1 (fr) * 2014-03-20 2015-09-24 日立オムロンターミナルソリューションズ株式会社 Système d'enregistrement/authentification biométrique et procédé d'enregistrement/authentification biométrique
JP6035452B2 (ja) * 2014-03-20 2016-11-30 日立オムロンターミナルソリューションズ株式会社 生体登録・認証システムおよび生体登録・認証方法
EP3518144A1 (fr) 2018-01-30 2019-07-31 Fujitsu Limited Appareil d'authentification biométrique et procédé d'authentification biométrique
US10896250B2 (en) 2018-01-30 2021-01-19 Fujitsu Limited Biometric authentication apparatus and biometric authentication method

Also Published As

Publication number Publication date
JPWO2013046365A1 (ja) 2015-03-26
JP5820483B2 (ja) 2015-11-24

Similar Documents

Publication Publication Date Title
CN103460244B (zh) 生物体认证装置、生物体认证系统以及生物体认证方法
JP5509335B2 (ja) 登録プログラム、登録装置、および登録方法
JP4546169B2 (ja) 手のひら認証用撮像装置
US9245168B2 (en) Authentication apparatus, authentication program, and authentication method
US9197416B2 (en) Verification apparatus, verification program, and verification method
CN110945520B (zh) 生物体认证系统
JP5681786B2 (ja) 生体情報取得装置、および生体情報取得方法
CN1892679A (zh) 生物特征认证方法、认证装置以及血管影像读取装置
CN102523381A (zh) 对无线设备的功能的受控访问
JP2013171325A (ja) 照合対象決定装置、照合対象決定プログラム、および照合対象決定方法
US20130170717A1 (en) Authentication apparatus, authentication program, and method of authentication
KR20150059779A (ko) 정보 처리 장치, 생체 부위 판정 프로그램 및 생체 부위 판정 방법
JP2010055228A (ja) カード処理装置及び処理方法
JP2008004001A (ja) 生体認証システムの登録処理ガイダンス装置及びその方法
KR102014394B1 (ko) 위조 지문에 대한 보안성이 향상된 휴대용 보안인증기의 인증방법
JP5820483B2 (ja) 案内装置、生体情報取得装置、および登録装置
TWI476702B (zh) 使用者辨識系統及辨識使用者的方法
KR102014410B1 (ko) 가짜지문에 대한 보안성이 향상된 휴대용 보안인증기
JP2022097361A (ja) 一定のクレジットカード取引について生体認証を行う支払端末
JP5174199B2 (ja) 血管像読取装置
JP5655155B2 (ja) 情報処理装置、情報処理方法、および情報処理プログラム
JP2014026585A (ja) 生体情報入力装置、生体支持状態判定方法、および生体支持状態判定プログラム
KR20040087295A (ko) 손가락의 위치 또는 압력 보정을 이용한 지문 감지 시스템및 방법
JPWO2012111664A1 (ja) 認証装置、認証プログラム、および認証方法
JP2013148988A (ja) 生体情報処理装置、生体情報処理プログラム、および生体情報処理方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11873305

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2013535716

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11873305

Country of ref document: EP

Kind code of ref document: A1

点击 这是indexloc提供的php浏览器服务,不要输入任何密码和下载