WO2015098253A1 - Electronic device - Google Patents
Electronic device
- Publication number
- WO2015098253A1 (PCT/JP2014/077616)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- user
- information
- input
- electronic device
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/44—Program or device authentication
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/40—User authentication by quorum, i.e. whereby two or more security principals are required
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
Definitions
- the present invention relates to an electronic device.
- in Patent Document 1, it cannot be said that sufficient consideration has been given to increasing the security of the wearable terminal itself or to increasing the security of other devices using the wearable terminal.
- the present invention has been made in view of the above-described problems, and an object thereof is to provide an electronic device capable of improving security.
- the electronic device of the present invention includes a first input unit that inputs an authentication result of a first part of the user, a second input unit that inputs an authentication result of a second part of the user, and a generation unit that generates setting information related to the user's operation based on the input information of the first and second input units.
- a transmission unit that transmits the setting information to an external device may be provided. Further, the transmission unit may transmit the setting information by proximity communication or human body communication via the user.
- the electronic apparatus of the present invention may further include a setting unit that performs settings related to the user's operation using the setting information.
- the setting unit may set a restriction on the user's operation based on the input results of the first and second input units.
- the first input unit may input an authentication result related to the user's eyes.
- the second input unit may input an authentication result related to the user's hand.
- the electronic device of the present invention includes an input unit that inputs an authentication result of a part of the user's body, and a transmission unit that transmits input information of the input unit to an external device through human body communication.
- a generation unit may be provided that generates setting information related to the user's operation based on the authentication result input by the input unit.
- the electronic apparatus includes a storage unit that stores information, an imaging unit that performs imaging, a determination unit that determines whether the imaging unit has captured information related to information stored in the storage unit, and a notification unit that notifies the user according to the determination result of the determination unit.
- information captured using the imaging unit may be stored in the storage unit.
- the storage unit may store text data, and the determination unit may perform the determination based on text data generated from imaging data captured by the imaging unit and the text data stored in the storage unit.
- a transmission unit may be provided that transmits the determination result of the determination unit to an external device.
- the transmission unit may transmit the information by proximity communication or human body communication via the user.
- the electronic apparatus includes an imaging unit that captures an image of character information, a determination unit that makes a determination regarding the character information captured by the imaging unit in communication with an external device, and a notification unit that reports the determination result of the determination unit to the user.
- the electronic device of the present invention includes an input unit that inputs an authentication result related to the user's eyes, and a generation unit that generates setting information related to the user's operation based on the input information of the input unit.
- the generation unit may limit the setting information based on an authentication result input by the input unit.
- the electronic device of the present invention includes an input unit that inputs an authentication result related to the user's eyes, and a transmission unit that transmits input information of the input unit to an external device through human body communication.
- the transmission unit may transmit information related to use restriction of the external device by the user.
- the electronic device of the present invention has an effect that security can be improved.
- FIG. 8B is a diagram showing an example of a user authentication table stored in the storage unit of the wristwatch-type device. FIG. 9 is a diagram showing the configuration of the electronic device system according to the second embodiment. FIG. 10 is a flowchart showing an example of the processing of the eyeglass-type device according to the second embodiment.
- FIG. 11A and FIG. 11B are diagrams for explaining the processing in step S76 in FIG. 10.
- FIG. 1 is a block diagram showing the configuration of an electronic device system 100 according to the first embodiment.
- the eyeglass-type device 10 includes an imaging unit 11, a display unit 12, an operation unit 13, a microphone 14, a storage unit 15, a communication unit 16, a retina information acquisition unit 18, an authentication unit 19, a control unit 17, and the like.
- the eyeglass-type device 10 is shown in a perspective view.
- the eyeglass-type device 10 includes an eyeglass-type frame 110. The components of the eyeglass-type device 10 that are illustrated in FIG. 1 but not illustrated in FIG. 3 are provided inside the frame 110 or on a part of the frame 110.
- the imaging unit 11 includes a lens, an imaging device, an image processing unit, and the like, and captures still images and moving images. As shown in FIG. 3, the imaging unit 11 is provided near the end of the frame 110 (near the user's right eye). For this reason, when the user wears the glasses-type device 10 (the state shown in FIG. 2), it is possible to capture an image in the direction in which the user is facing (looking at).
- the operation unit 13 is a touch pad provided on the frame 110, detects the movement of the user's finger, receives an operation from the user, and transmits the received operation information to the control unit 17.
- the details of the imaging unit 11, the display unit 12, and the operation unit 13 are also disclosed in, for example, US Published Patent No. 2013/0044042.
- the storage unit 15 is a non-volatile semiconductor memory such as a flash memory, and stores, for example, image data captured by the imaging unit 11, data used for authentication by the authentication unit 19 (see the user authentication table in FIG. 8A), display data to be displayed on the display unit 12, and various programs.
- the communication unit 16 performs human body communication with other devices.
- the communication unit 16 includes an electrode unit 16a that is provided on the frame 110 and can contact a user, and a human body communication unit 16b that performs human body communication using the electrode unit 16a.
- Human body communication includes a current system, in which a weak current is passed through the human body and modulated to transmit information, and an electric field system, in which information is transmitted by modulating an electric field induced on the surface of the human body. In this embodiment, either the current system or the electric field system can be used. Note that human body communication is possible not only when the user is in direct contact with the electrode portion 16a of the eyeglass-type device 10 but also when the electrode portion 16a and the user are not in direct contact.
- the retina information acquisition unit 18 includes an infrared irradiation unit and an infrared light receiving unit.
- the retinal information acquisition unit 18 detects (scans) the path of blood vessels on the retina by irradiating the eyeball with infrared rays from the infrared irradiation unit and receiving the infrared rays reflected by the eyeballs with the infrared light reception unit. Then, the retinal information acquisition unit 18 outputs the detection result to the control unit 17.
- the retinal information acquisition unit 18 is provided on the back side (face side) of the imaging unit 11 in FIG. 2 as an example.
- the control unit 17 comprehensively controls the entire eyeglass-type device 10.
- the control unit 17 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like.
- the control unit 17 performs control of the imaging unit 11 and the retinal information acquisition unit 18, control of the authentication unit 19 and the display unit 12, and the like.
- the wristwatch type device 30 includes a display unit 31, an operation unit 32, a vein information acquisition unit 33, a storage unit 34, an authentication unit 35, a communication unit 36, a control unit 37, and the like.
- the display unit 31 includes an organic EL (Electro-Luminescence) display, a liquid crystal display, and the like, and displays various types of information under instructions from the control unit 37.
- the operation unit 32 includes a touch panel, buttons, and the like, receives user operations, and transmits them to the control unit 37.
- the vein information acquisition unit 33 is provided at a position facing the user's wrist in a state where the user wears the wristwatch-type device 30 (the state shown in FIG. 2).
- the vein information acquisition unit 33 includes an infrared irradiation unit and an infrared light reception unit.
- the vein information acquisition unit 33 detects (scans) the shape of the vein by irradiating the wrist with infrared rays from the infrared irradiation unit and receiving infrared rays reflected by the veins on the wrist with the infrared light reception unit.
- the vein information acquisition unit 33 outputs the detection result to the control unit 37.
- the storage unit 34 is a non-volatile semiconductor memory such as a flash memory, and stores, for example, data used for authentication by the authentication unit 35 (see the user authentication table in FIG. 8B), display data to be displayed on the display unit 31, and various programs.
- the communication unit 36 performs human body communication with other devices. Note that the communication unit 36 may perform near field communication with other devices. In the present embodiment, the communication unit 36 communicates with the communication unit 16 of the eyeglass-type device 10 and the first communication unit 24 of the PC 20 using human body communication.
- the communication unit 36 includes an electrode unit 24a that can be in contact with the user's arm, and a human body communication unit 24b that performs human body communication via the electrode unit 24a.
- the communication unit 36 performs human body communication with the communication unit 16 of the glasses-type device 10 described above.
- the PC 20 includes a display unit 21, an operation unit 22, a storage unit 23, a first communication unit 24, a second communication unit 25, a control unit 26, and the like.
- the PC 20 may be a desktop PC as shown in FIG. 3, or may be a notebook PC, a tablet terminal, or a smartphone.
- the second communication unit 25 communicates with other devices wirelessly or by wire.
- the second communication unit 25 communicates with a web server (not shown) using the Internet or the like.
- the control unit 26 includes a CPU, a RAM, a ROM, and the like, and controls the entire PC 20. In the present embodiment, the control unit 26 controls information communication with the eyeglass-type device 10 and the wristwatch-type device 30. In addition, the control unit 26 exchanges information with a Web server or the like to display a Web site on the display unit 21 or perform various information processing.
- the pairing is a state in which cooperative processing between a plurality of devices can be executed. Further, when the power supply of one device is off, the power supply of the device that is turned off may be turned on by establishing human body communication or proximity communication with the other device. In the present embodiment, it is assumed that human body communication can be performed between paired devices.
- FIG. 4 is a flowchart illustrating an example of authentication instruction transmission processing by the PC 20.
- the process of FIG. 4 is a process executed while the PC 20 is powered on.
- the control unit 26 determines whether or not use restriction information is necessary.
- the use restriction information is necessary, for example, when making a transfer at a bank (net banking) site, when making a credit card payment, or when viewing a document file for which confidentiality has been set.
- in such a case, the determination in step S10 is affirmed and the process proceeds to step S12.
- step S10 is repeated at predetermined time intervals.
- when the determination in step S10 is affirmed and the process proceeds to step S12, the control unit 26 transmits an authentication instruction to the paired eyeglass-type device 10 and wristwatch-type device 30. After the process of step S12 is performed, the process returns to step S10.
- FIG. 5 is a flowchart illustrating an example of processing performed by the glasses-type device 10.
- the process of FIG. 5 is a process executed while the glasses-type device 10 is powered on.
- the control unit 17 stands by until an authentication instruction is received from the PC 20.
- the process proceeds to step S22 at the timing when the process of step S12 of FIG. 4 described above is performed.
- in step S22, the control unit 17 instructs the retina information acquisition unit 18 to acquire the user's retina information.
- in step S24, the control unit 17 instructs the authentication unit 19 to execute user authentication for the eyeglass-type device 10 using the acquired retina information.
- the authentication unit 19 performs user authentication with reference to the user authentication table (FIG. 8A) stored in the storage unit 15.
- in step S26, the control unit 17 determines whether or not an authentication result has been input from the wristwatch-type device 30.
- the case where the determination in step S26 is negative is a case where the user is not wearing the wristwatch-type device 30. If the determination is negative, the process proceeds to step S28; if the determination is positive, the process proceeds to step S30. When proceeding to step S28, the control unit 17 determines that user authentication by the wristwatch-type device 30 has not been performed.
- in step S30, the control unit 17 determines whether the user authentication result of the eyeglass-type device 10 matches the user authentication result of the wristwatch-type device 30.
- when the determination in step S30 is affirmed, the process proceeds to step S32, and the control unit 17 generates use restriction information (none).
- the use restriction information (none) is information indicating that the user can make a transfer or the like on the net banking site without any restriction on the amount of money, that there is no restriction on the amount of money used in credit card payment, and that the user can use a document file for which confidentiality is set without limitation.
- in step S40, the control unit 17 transmits the use restriction information (none) to the PC 20 via the communication unit 16, and ends the process of FIG. 5.
- if the determination in step S30 is negative, that is, if the user authentication results do not match, the process proceeds to step S34, and the control unit 17 determines whether user authentication has failed in one of the eyeglass-type device 10 and the wristwatch-type device 30.
- if the determination in step S34 is affirmative, that is, if user authentication has succeeded in only one of the eyeglass-type device 10 and the wristwatch-type device 30, the process proceeds to step S36, and the control unit 17 generates use restriction information (partial restriction).
- use restriction information (partial restriction) is information indicating that the user can make transfers and the like on the net banking site under a certain amount limit, that the amount of money usable in credit card payments is restricted compared to the unrestricted case, and that the user can use a document file for which confidentiality is set only under a predetermined restriction.
- after step S36, the control unit 17 transmits the use restriction information (partial restriction) to the PC 20 via the communication unit 16 in step S40, and ends the whole process of FIG. 5.
- if the determination in step S34 is negative, that is, if user authentication has failed in both the eyeglass-type device 10 and the wristwatch-type device 30, the process proceeds to step S38, and the control unit 17 generates use restriction information (all restrictions).
- use restriction information (all restrictions) is information indicating that the user cannot perform transfers and the like on the net banking site, that credit card payments cannot be made at all, and that the user cannot use a document file for which confidentiality is set.
- after step S38, the control unit 17 transmits the use restriction information (all restrictions) to the PC 20 via the communication unit 16 in step S40, and ends the whole process of FIG. 5.
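- the branching in steps S30 to S38 amounts to a mapping from the two user authentication results to a restriction level. A minimal sketch of that mapping (the function name and level strings are illustrative, not from the publication):

```python
def restriction_level(eye_auth_ok: bool, hand_auth_ok: bool) -> str:
    """Map the two user-authentication results to use restriction information.

    Mirrors steps S30-S38: both succeed -> no restriction, exactly one
    succeeds -> partial restriction, both fail -> all restricted.
    """
    if eye_auth_ok and hand_auth_ok:
        return "none"     # S32: no restriction on amount or access
    if eye_auth_ok or hand_auth_ok:
        return "partial"  # S36: restricted amount / limited access
    return "all"          # S38: use prohibited
```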
- FIG. 6 is a flowchart illustrating an example of processing by the wristwatch type device 30.
- the process of FIG. 6 is a process executed while the wristwatch type device 30 is powered on.
- the control unit 37 stands by until an authentication instruction is received from the PC 20.
- when the authentication instruction is received, the control unit 37 proceeds to step S52.
- in step S52, the control unit 37 instructs the vein information acquisition unit 33 to acquire the user's vein information.
- in step S54, the control unit 37 instructs the authentication unit 35 to execute user authentication using the acquired vein information.
- the authentication unit 35 performs user authentication with reference to the user authentication table (FIG. 8B) stored in the storage unit 34.
- in step S56, the control unit 37 transmits the authentication result of step S54 to the eyeglass-type device 10 via the communication unit 36.
- the control unit 37 transmits information indicating authentication success or authentication failure to the eyeglass-type device 10 as the authentication result.
- in step S58, the control unit 37 determines whether or not the authentication result has been transmitted to the eyeglass-type device 10. If the determination here is affirmed, all the processes in FIG. 6 are terminated. This is because, when the authentication result can be transmitted, the processing using the authentication result (S30 to S40 in FIG. 5) is performed in the eyeglass-type device 10 as described above.
- if the determination in step S58 is negative, that is, if the authentication result cannot be transmitted because human body communication with the eyeglass-type device 10 is not established, the control unit 37 proceeds to step S60.
- in step S60, the control unit 37 determines whether or not the user authentication in step S54 was successful. When the determination in step S60 is affirmed, the process proceeds to step S62, and the control unit 37 generates use restriction information (partial restriction). After step S62, the control unit 37 transmits the use restriction information (partial restriction) to the PC 20 via the communication unit 36 in step S66, and ends the whole process of FIG. 6.
- the case of proceeding to step S64 is the case where user authentication in the eyeglass-type device 10 has not been performed and user authentication has failed in the wristwatch-type device 30. For this reason, in step S64, the control unit 37 generates use restriction information (all restrictions). The control unit 37 then transmits the use restriction information (all restrictions) to the PC 20 via the communication unit 36 in step S66, and ends the whole process of FIG. 6.
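- the fallback in steps S58 to S66 can likewise be sketched as follows: when the eyeglass-side result is unreachable, the wristwatch-type device decides from its own authentication alone (names are illustrative assumptions):

```python
def fallback_restriction(vein_auth_ok: bool) -> str:
    """Steps S60-S64: with only the vein result available, success still
    yields only a partial restriction (the eye was never authenticated),
    and failure yields full restriction."""
    return "partial" if vein_auth_ok else "all"
```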
- FIG. 7 is a flowchart illustrating an example of usage restriction information utilization processing by the PC 20.
- the control unit 26 stands by until receiving use restriction information. That is, the control unit 26 proceeds to step S16 at the timing when step S40 of FIG. 5 is executed in the eyeglass-type device 10 or when step S66 of FIG. 6 is executed in the wristwatch-type device 30.
- the control unit 26 executes processing based on the use restriction information. For example, when the use restriction information (none) is received while the user is accessing the net banking site, the control unit 26 transmits the use restriction information (none) to the server that manages the net banking site. In this case, the server removes the restriction on the user's usage amount based on the use restriction information (none). As a result, the user can make a transfer or the like on the net banking site without any amount limit. Similarly, when the use restriction information (partial restriction) is received while the user is accessing the net banking site, the control unit 26 transmits the use restriction information (partial restriction) to the server that manages the net banking site.
- in this case, the server sets a limit on the user's usage amount based on the use restriction information (partial restriction). As a result, the user can make a transfer or the like on the net banking site under a predetermined amount limit. Further, when the use restriction information (all restrictions) is received while the user is accessing the net banking site, the control unit 26 transmits the use restriction information (all restrictions) to the server that manages the net banking site. In this case, the server prohibits use by the user based on the use restriction information (all restrictions). As a result, the user cannot perform a transfer or the like on the net banking site. Note that the same processing as described above is performed at the time of credit card settlement.
- when the use restriction information (none) is received when the user accesses a document file for which confidentiality is set, the control unit 26 permits the user to access the document file.
- when the use restriction information (partial restriction) is received, the control unit 26 determines whether the confidentiality level set for the document file is high, and permits the user to access the document file only when the level is not high.
- when the use restriction information (all restrictions) is received, the control unit 26 prohibits the user from accessing the document file for which confidentiality is set.
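- the document-file handling described above can be sketched as a simple access decision; the level strings and function name are assumptions for illustration:

```python
def may_open_document(restriction: str, confidentiality: str) -> bool:
    """Decide access to a confidential document from the use restriction info.

    "none" -> full access; "partial" -> only documents whose confidentiality
    level is not high; "all" -> access prohibited.
    """
    if restriction == "none":
        return True
    if restriction == "partial":
        return confidentiality != "high"
    return False  # restriction == "all": access prohibited
```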
- as described above, the control unit 17 of the eyeglass-type device 10 generates setting information (use restriction information) related to the user's operation based on the user authentication result using the user's retina and the user authentication result using the user's veins (S32, S36, S38). Accordingly, appropriate use restriction information can be generated based on the two user authentication results, and the security of the apparatus can be improved by using that information.
- since the communication unit 16 of the eyeglass-type device 10 transmits the use restriction information to the PC 20 by human body communication or proximity communication, the PC 20 can acquire the user authentication information of a user in the vicinity of the PC 20, that is, a user who is likely to actually use the PC 20.
- the control unit 37 of the wristwatch-type device 30 acquires an authentication result of a part of the user's body (the veins of the arm) and transmits it to the eyeglass-type device 10 by human body communication via the communication unit 36. Thereby, user authentication results can be exchanged between worn devices.
- in the above description, the eyeglass-type device 10 and the wristwatch-type device 30 generate use restriction information for setting the user's operation on the PC 20 based on the two user authentication results, but the present invention is not limited to this.
- the eyeglass-type device 10 or the wristwatch-type device 30 may generate use restriction information for performing user operation settings in the eyeglass-type device 10 or the wristwatch-type device 30. Thereby, the security in the glasses-type device 10 and the wristwatch-type device 30 can be improved.
- the user authentication in the eyeglass-type device 10 is performed using the retina information.
- the present invention is not limited to this, and iris information may be used in the authentication related to the user's eyes.
- iris authentication is disclosed, for example, in Japanese Patent Application Laid-Open No. 2013-148961 (Patent No. 5360931).
- user authentication in the wristwatch-type device 30 is performed using vein information.
- fingerprint information may be used in authentication related to a user's hand.
- the user authentication method is not limited to eyes and hands, and authentication using other parts of the user may be employed.
- the user authentication process using the eyeglass-type device 10 and the wristwatch-type device 30 may be executed when the user views the display unit 21 of the PC 20.
- when the control unit 17 recognizes from the captured image that the screen of the display unit 21 has been captured by the imaging unit 11, the control unit 17 may start the processing of FIG. 5.
- in this case, the control unit 37 of the wristwatch-type device 30 may be notified, and the wristwatch-type device 30 may perform the process of FIG. 6 at the notified timing.
- the control unit 17 may determine whether or not the screen of the display unit 21 has been captured by the imaging unit 11 based on whether a characteristic portion of the display unit 21 is included in the captured image, or whether a characteristic portion of an image displayed on the screen of the display unit 21 is included in the captured image.
- the wristwatch-type device 30 transmits the user authentication result to the eyeglass-type device 10 (S56)
- the present invention is not limited to this.
- the eyeglass-type device 10 may transmit a user authentication result to the wristwatch-type device 30.
- in the above description, the use restriction information generated based on the user authentication result of the eyeglass-type device 10 and the user authentication result of the wristwatch-type device 30 is transmitted to the PC 20, but the present invention is not limited to this.
- the user authentication result of the eyeglass-type device 10 and the user authentication result of the wristwatch-type device 30 may be transmitted to the PC 20.
- the control unit 26 of the PC 20 may generate usage restriction information based on both user authentication results.
- in the above description, the eyeglass-type device 10 and the wristwatch-type device 30 are adopted as devices worn by a person, but the present invention is not limited to this, and another wearable device such as a contact-lens-type terminal may be used.
- in the above description, the device that receives the use restriction information is the PC 20, but the present invention is not limited to this; another device such as an imaging device may be used.
- FIG. 9 is a block diagram showing the configuration of an electronic device system 100 ′ according to the second embodiment.
- the electronic device system 100′ differs from the electronic device system 100 of the first embodiment in that the eyeglass-type device 10 includes an image recognition unit 101 and an OCR (Optical Character Recognition) unit 102. Note that at least one of the image recognition unit 101 and the OCR unit 102 may be provided in the PC 20 (which may be a tablet terminal or a smartphone), and the eyeglass-type device 10 may receive the result by human body communication or proximity communication.
- the image recognition unit 101 extracts feature points of the image data captured by the imaging unit 11 of the eyeglass-type device 10 using, for example, a feature detection algorithm such as SURF (Speeded Up Robust Features) or SIFT (Scale Invariant Feature Transform), and compares the image with the images stored in the storage unit 15.
- the OCR unit 102 recognizes characters included in the image data captured by the imaging unit 11 of the glasses-type device 10 and converts them into text data.
- a URL included in an image of a website imaged by the imaging unit 11 is recognized and converted into text data.
- the process of FIG. 10 is a process executed when the user instructs execution of a phishing countermeasure process through the operation unit 13.
- in step S70, the control unit 17 acquires a captured image from the imaging unit 11. Note that the imaging unit 11 captures still images at predetermined time intervals.
- in step S72, the control unit 17 instructs the image recognition unit 101 to determine, based on the captured image of the imaging unit 11, whether or not the user is viewing a browser on the display unit 21.
- For example, the image recognition unit 101 refers to the captured image and determines whether the mark attached to the upper left corner of the window (the "*" mark in FIG. 11A) is a browser-specific mark. If it is, the unit determines that the user has viewed the browser.
- If the determination in step S72 is negative, the process returns to step S70; if it is positive, the process proceeds to step S74.
- In step S74, the control unit 17 temporarily stores the captured image.
- In step S76, the control unit 17 instructs the image recognition unit 101 to compare the captured image with the past browsing images stored in the storage unit 15. For example, assume that the image captured by the imaging unit 11 this time is the image illustrated in FIG. 11A. In this case, the image recognition unit 101 compares the image shown in FIG. 11A with the past browsing images stored in the storage unit 15 and calculates the similarity with each browsing image.
- In step S78, the control unit 17 determines whether there is a browsing image whose similarity is equal to or greater than a threshold value.
- As the threshold value, a value of, for example, 80% to 90% can be adopted. If the determination in step S78 is negative, that is, if there is no browsing image whose similarity is equal to or greater than the threshold, the process proceeds to step S80, and the control unit 17 stores the temporarily stored captured image in the storage unit 15 as a past browsing image. Thereafter, the process returns to step S70.
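Steps S76 through S80 can be sketched as follows. The similarity function is assumed to be supplied by the image recognition unit (any scorer returning values in [0, 1] works), and the 0.85 threshold is one value within the 80%-90% range mentioned above.

```python
# Sketch of steps S76-S80: find the most similar stored browsing image; if
# none reaches the threshold, archive the capture as a new past browsing image.
THRESHOLD = 0.85  # within the 80%-90% range suggested in the text

def process_capture(captured, past_images, similarity):
    """Return the matched past image (to proceed to the URL check of step S82),
    or None after storing the capture as a new past browsing image (step S80)."""
    scored = [(similarity(captured, img), img) for img in past_images]
    best_score, best_image = max(scored, key=lambda t: t[0], default=(0.0, None))
    if best_score >= THRESHOLD:
        return best_image
    past_images.append(captured)
    return None
```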
- On the other hand, assume that the similarity between the image captured this time (FIG. 11A) and a past browsing image such as the one shown in FIG. 11B (an image captured when browsing the Web site of an Internet bank) is equal to or greater than the threshold value. In this case, the determination in step S78 is positive, and the process proceeds to step S82.
- In step S82, the control unit 17 uses the OCR unit 102 to compare the URL included in the captured image with the URL in the browsing image. It is assumed that the control unit 17 knows where in the browser the URL of a Web site is displayed. The control unit 17 therefore extracts the range in which the URL is described from the captured image and from the browsing image, and causes the OCR unit 102 to convert both into text. Then, the control unit 17 compares the URLs converted into text. Note that the control unit 17 may instead compare the images of the URL portions using the image recognition unit 101.
- In step S84, the control unit 17 determines whether the URLs are the same based on the result of comparing the text URLs.
- If the URLs differ, the determination in step S84 is negative and the process proceeds to step S86, because the Web site included in the captured image may be a Web site aimed at phishing.
- If the determination in step S84 is affirmative, that is, if the compared URLs are the same, the process returns to step S70 without passing through step S86.
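The URL check of steps S82 through S84 amounts to deciding whether two URL strings refer to the same site. A minimal sketch, assuming the URLs have already been converted to text by the OCR unit (the sample URLs are hypothetical):

```python
from urllib.parse import urlsplit

# Sketch of the step S84 decision: a phishing page may mimic a site's
# appearance, but its host differs, so comparing the normalized scheme,
# host, and path of the two URLs exposes the mismatch.
def same_site(captured_url, stored_url):
    a, b = urlsplit(captured_url.strip()), urlsplit(stored_url.strip())
    return (a.scheme.lower(), a.netloc.lower(), a.path) == (
        b.scheme.lower(), b.netloc.lower(), b.path)

# Identical URLs pass; a look-alike host would trigger the warning of step S86.
ok = same_site("https://bank.example.com/login", "https://bank.example.com/login")
fake = same_site("https://bank-example.com/login", "https://bank.example.com/login")
```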
- As described above, in the second embodiment, the control unit 17 compares the URL included in the captured image captured by the imaging unit 11 with the URL included in the past browsing image stored in the storage unit 15, determines whether the imaging unit 11 has captured information related to the past browsing image (that is, whether the Web site viewed by the user and the Web site included in the past browsing image are the same), and warns the user according to the determination result. This makes it possible to prevent the user from accessing a fake Web site (such as a phishing scam site) that at first glance resembles a Web site browsed in the past.
- The control unit 17 also stores captured images captured using the imaging unit 11 in the storage unit 15 as appropriate (S80). This makes it possible to add new captured images to the stock of past browsing images.
- The control unit 17 transmits to the PC 20 the determination result as to whether the Web site viewed by the user is the same as the Web site included in the past browsing image, and causes the PC 20 to warn the user. This makes it possible to display the warning in an easy-to-understand manner, at a position appropriate for the user of the PC 20.
- Note that the URL may be converted into text at the timing when the past browsing image is stored in the storage unit 15 (S80), and stored in the storage unit 15 in association with that past browsing image.
- In the above description, the control unit 17 of the glasses-type device 10 performs the image comparison and the URL comparison. However, these comparisons may instead be performed by the PC 20, the wristwatch-type device 30, or a server (cloud server) connected to the PC 20. In that case, the control unit 17 of the glasses-type device 10 may transmit the image captured by the imaging unit 11 to the PC 20, the wristwatch-type device 30, or the server as needed.
- In the second embodiment described above, the control unit 17 of the eyeglass-type device 10 compares the URL included in the captured image captured by the imaging unit 11 with the URL included in the past browsing image stored in the storage unit 15, and thereby determines whether the imaging unit 11 has captured information related to a past browsing image.
- Instead, the control unit 17 of the eyeglass-type device 10 may communicate with a predetermined Web server or the like via the Internet, search for the URL included in the captured image captured by the imaging unit 11, determine whether the URL is the URL of a fake (spoofed) Web site, and output (notify) the determination result to the user. This eliminates the need to store the URLs included in past browsing images in the storage unit 15, which simplifies the processing and reduces the amount of data stored in the storage unit 15.
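The modification just described, in which the device queries a Web server about a URL instead of storing past URLs, can be sketched as a lookup against a reputation service. The denylist below is a local, hypothetical stand-in for the server-side check; a real device would issue the query over the Internet via the communication units.

```python
from urllib.parse import urlsplit

# Hypothetical denylist standing in for the Web server that judges whether a
# URL belongs to a fake (spoofed) site.
KNOWN_FAKE_HOSTS = {"bank-example.com", "paypa1.example.net"}

def is_fake_site(url):
    """Return True if the URL's host is flagged by the (simulated) reputation service."""
    host = urlsplit(url).netloc.lower()
    return host in KNOWN_FAKE_HOSTS
```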
- The functions of the glasses-type device 10 and the wristwatch-type device 30 may also be divided among devices; for example, the display function may be given to a contact-lens-type terminal provided with a display unit, and the other functions may be given to the spectacle-type device or the wristwatch-type device.
- the above processing functions can be realized by a computer.
- a program describing the processing contents of the functions that the processing device (CPU) should have is provided.
- the program describing the processing contents can be recorded on a computer-readable recording medium (except for a carrier wave).
- When the program is distributed, it is sold, for example, in the form of a portable recording medium such as a DVD (Digital Versatile Disc) or a CD-ROM (Compact Disc Read Only Memory) on which the program is recorded. It is also possible to store the program in a storage device of a server computer and transfer the program from the server computer to another computer via a network.
- the computer that executes the program stores, for example, the program recorded on the portable recording medium or the program transferred from the server computer in its own storage device. Then, the computer reads the program from its own storage device and executes processing according to the program. The computer can also read the program directly from the portable recording medium and execute processing according to the program. Further, each time the program is transferred from the server computer, the computer can sequentially execute processing according to the received program.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Security & Cryptography (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
An electronic device equipped with a first input unit (18) into which an authentication result for a first part of a user is input, a second input unit (16) into which an authentication result for a second part of the user is input, and a generation unit (17) that, on the basis of the input information from the first and second input units, generates setting information related to a user operation.
Description
The present invention relates to an electronic device.
In recent years, wearable information terminals such as glasses-type and wristwatch-type terminals have appeared (see, for example, Patent Document 1).
However, Patent Document 1 and the like cannot be said to give sufficient consideration to increasing the security of the wearable terminal itself, or to using the wearable terminal to increase the security of other devices.
The present invention has been made in view of the above problems, and an object thereof is to provide an electronic device capable of improving security.
The electronic device of the present invention includes a first input unit that inputs an authentication result of a first part of a user, a second input unit that inputs an authentication result of a second part of the user, and a generation unit that generates setting information related to the user's operation based on input information from the first and second input units.
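As an illustrative sketch of the generation unit, setting information can be derived from the two authentication results. The specific permissions and the policy (full access only when both parts authenticate) are assumptions chosen to mirror the restrictions described elsewhere in this document, not a definition taken from the invention.

```python
# Hypothetical policy: the generation unit derives operation settings from
# the first-part (e.g. eye) and second-part (e.g. hand) authentication results.
def generate_settings(eye_ok, hand_ok):
    if eye_ok and hand_ok:  # both parts authenticated: no restriction
        return {"bank_transfer": True, "credit_payment": True, "confidential_docs": True}
    if eye_ok or hand_ok:   # partial authentication: limited operations
        return {"bank_transfer": False, "credit_payment": False, "confidential_docs": True}
    return {"bank_transfer": False, "credit_payment": False, "confidential_docs": False}
```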
In this case, the electronic device may include a transmission unit that transmits the setting information to an external device. The transmission unit may transmit the setting information by proximity communication or by human body communication via the user.
The electronic device of the present invention may further include a setting unit that performs settings related to the user's operation using the setting information. In this case, the setting unit may place restrictions on the user's operation based on the input results of the first and second input units.
In the electronic device of the present invention, the first input unit may input an authentication result related to the user's eye, and the second input unit may input an authentication result related to the user's hand. The electronic device may also include a mounting part for wearing the device near the user's face.
The electronic device of the present invention includes an input unit that inputs an authentication result of a part of the user's body, and a transmission unit that transmits input information of the input unit to an external device by human body communication. In this case, the electronic device may include a generation unit that generates setting information related to the user's operation based on the authentication result input by the input unit when the transmission unit cannot transmit the input information to the external device.
The electronic device of the present invention includes a storage unit that stores information, an imaging unit that performs imaging, a determination unit that determines whether the imaging unit has captured information related to the information stored in the storage unit, and a notification unit that notifies the user according to the determination result of the determination unit.
In this case, information captured using the imaging unit may be stored in the storage unit. The storage unit may store text data, and the determination unit may make its determination based on text data generated from imaging data captured by the imaging unit and the text data stored in the storage unit. The electronic device may also include a transmission unit that transmits the determination result of the determination unit to an external device; in this case, the transmission unit may transmit the information by proximity communication or by human body communication via the user.
The electronic device of the present invention includes an imaging unit that captures an image of character information, a determination unit that determines the authenticity of the character information captured by the imaging unit by communicating with an external device, and a notification unit that notifies the user according to the determination result of the determination unit.
The electronic device of the present invention includes an input unit that inputs an authentication result related to the user's eye, and a generation unit that generates setting information related to the user's operation based on the input information of the input unit. In this case, the generation unit may place restrictions on the setting information based on the authentication result input by the input unit.
The electronic device of the present invention includes an input unit that inputs an authentication result related to the user's eye, and a transmission unit that transmits input information of the input unit to an external device by human body communication. In this case, the transmission unit may transmit information related to restrictions on the user's use of the external device.
The electronic device of the present invention has the effect of improving security.
<< First Embodiment >>
Hereinafter, the electronic device system according to the first embodiment will be described in detail with reference to FIGS. 1 to 8(b). FIG. 1 is a block diagram showing the configuration of an electronic device system 100 according to the first embodiment.
As shown in FIG. 1, the electronic device system 100 includes (utilizes) a glasses-type device 10, a wristwatch-type device 30, and a PC (Personal Computer) 20. As shown in FIG. 2, the glasses-type device 10 is a wearable terminal that the user wears on the face, and the wristwatch-type device 30 is a wearable terminal that the user wears on the arm. In this embodiment, the user can use the PC 20 without wearing the glasses-type device 10 or the wristwatch-type device 30; however, unless the user wears one or both of them, use of Web sites is restricted and the information that can be browsed on the PC 20 is limited.
(Glasses-type device 10)
As shown in FIG. 1, the glasses-type device 10 includes an imaging unit 11, a display unit 12, an operation unit 13, a microphone 14, a storage unit 15, a communication unit 16, a retina information acquisition unit 18, an authentication unit 19, a control unit 17, and the like. FIG. 3 shows the glasses-type device 10 in a perspective view. As shown in FIG. 3, the glasses-type device 10 includes a glasses-type frame 110. Components of the glasses-type device 10 that are illustrated in FIG. 1 but not in FIG. 3 are provided inside the frame 110 or on a part of the frame 110.
The imaging unit 11 includes a lens, an imaging element, an image processing unit, and the like, and captures still images and moving images. As shown in FIG. 3, the imaging unit 11 is provided near the end of the frame 110 (near the user's right eye). Therefore, when the user wears the glasses-type device 10 (the state shown in FIG. 2), it can capture an image in the direction in which the user is facing (looking).
The display unit 12 includes a projector provided inside or near the frame 110, and a prism for guiding the projected image from the projector to the user's eye. The display unit 12 displays various information under the instruction of the control unit 17.
The operation unit 13 is a touch pad provided on the frame 110; it detects the movement of the user's finger, receives operations from the user, and transmits the received operation information to the control unit 17.
Details of the imaging unit 11, the display unit 12, and the operation unit 13 are also disclosed in, for example, U.S. Patent Application Publication No. 2013/0044042.
The microphone 14 is provided on the frame 110 and collects the voice uttered by the user and sounds around the user. The voice collected by the microphone 14 is recognized by a voice recognition unit (not shown), and the voice recognition result is transmitted to the control unit 17. The control unit 17 executes processing based on the voice recognition result (for example, command execution processing). The control unit 17 may itself perform the voice recognition.
The storage unit 15 is a non-volatile semiconductor memory such as a flash memory, and stores image data captured by the imaging unit 11, data used for authentication by the authentication unit 19 (see the user authentication table in FIG. 8(a)), display data to be displayed on the display unit 12, various programs, and the like.
The communication unit 16 performs human body communication with other devices. The communication unit 16 includes an electrode unit 16a that is provided on the frame 110 and can contact the user, and a human body communication unit 16b that performs human body communication using the electrode unit 16a. Human body communication methods include a current method, in which a weak current is passed through the human body and modulated to transmit information, and an electric field method, in which information is transmitted by modulating an electric field induced on the surface of the human body. In this embodiment, either the current method or the electric field method can be used. Human body communication is possible not only when the user directly touches the electrode unit 16a of the glasses-type device 10 but also when the electrode unit 16a and the user are not in direct contact. In this embodiment, the communication unit 16 communicates with the wristwatch-type device 30 and the PC 20. The communication unit 16 may also perform proximity communication with other devices using, for example, Bluetooth (registered trademark), RFID (Radio Frequency Identification), or TransferJet (registered trademark).
The retina information acquisition unit 18 includes an infrared irradiation unit and an infrared light receiving unit. The retina information acquisition unit 18 detects (scans) the paths of the blood vessels on the retina by irradiating the eyeball with infrared light from the infrared irradiation unit and receiving the infrared light reflected by the eyeball with the infrared light receiving unit. The retina information acquisition unit 18 then outputs the detection result to the control unit 17. As an example, the retina information acquisition unit 18 is provided on the back side (face side) of the imaging unit 11 in FIG. 2.
The authentication unit 19 executes user authentication using the information on the user's retina acquired by the retina information acquisition unit 18. Specifically, the authentication unit 19 performs user authentication by determining whether the acquired retina information matches the retina pattern (m123456.jpg) of the authorized user (user ID = 123456) registered in the user authentication table shown in FIG. 8(a).
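The table lookup performed by the authentication unit 19 can be sketched as below. Patterns are represented as opaque strings keyed by user ID, mirroring the user authentication table of FIG. 8(a); a real device would compare extracted blood-vessel features with some tolerance rather than testing raw equality.

```python
# Hypothetical in-memory version of the user authentication table of FIG. 8(a):
# user ID -> registered retina pattern (an opaque identifier here).
USER_AUTH_TABLE = {"123456": "m123456.jpg"}

def authenticate(user_id, acquired_pattern):
    """Succeed only when the acquired pattern matches the registered one."""
    registered = USER_AUTH_TABLE.get(user_id)
    return registered is not None and registered == acquired_pattern
```

The vein authentication performed by the wristwatch-type device follows the same lookup shape against the table of FIG. 8(b).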
The control unit 17 comprehensively controls the entire glasses-type device 10. The control unit 17 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like. In this embodiment, the control unit 17 controls the imaging unit 11 and the retina information acquisition unit 18, as well as the authentication unit 19 and the display unit 12.
(Wristwatch-type device 30)
As shown in FIG. 1, the wristwatch-type device 30 includes a display unit 31, an operation unit 32, a vein information acquisition unit 33, a storage unit 34, an authentication unit 35, a communication unit 36, a control unit 37, and the like.
The display unit 31 includes an organic EL (Electro-Luminescence) display, a liquid crystal display, or the like, and displays various information under the instruction of the control unit 37. The operation unit 32 includes a touch panel, buttons, and the like; it receives user operations and transmits them to the control unit 37.
The vein information acquisition unit 33 is provided at a position where it can face the user's wrist while the user wears the wristwatch-type device 30 (the state shown in FIG. 2). The vein information acquisition unit 33 includes an infrared irradiation unit and an infrared light receiving unit, and detects (scans) the shape of the veins by irradiating the wrist with infrared light from the infrared irradiation unit and receiving the infrared light reflected by the veins of the wrist with the infrared light receiving unit. The vein information acquisition unit 33 outputs the detection result to the control unit 37.
The storage unit 34 is a non-volatile semiconductor memory such as a flash memory, and stores data used for authentication by the authentication unit 35 (see the user authentication table in FIG. 8(b)), display data to be displayed on the display unit 31, various programs, and the like.
The authentication unit 35 performs user authentication using the information on the veins of the user's arm acquired by the vein information acquisition unit 33. Specifically, the authentication unit 35 performs user authentication by determining whether the acquired vein information matches the vein pattern (j123456.jpg) of the authorized user (user ID = 123456) registered in the user authentication table shown in FIG. 8(b).
The communication unit 36 performs human body communication with other devices; it may also perform proximity communication. In this embodiment, the communication unit 36 communicates with the communication unit 16 of the glasses-type device 10 and the first communication unit 24 of the PC 20 using human body communication. The communication unit 36 includes an electrode unit 24a that is provided on the band portion of the wristwatch-type device 30 and can contact the user's arm, and a human body communication unit 24b that performs human body communication via the electrode unit 24a. The communication unit 36 performs human body communication with the communication unit 16 of the glasses-type device 10 described above.
The control unit 37 includes a CPU, a RAM, a ROM, and the like, and comprehensively controls the entire wristwatch-type device 30.
(PC 20)
As shown in FIG. 1, the PC 20 includes a display unit 21, an operation unit 22, a storage unit 23, a first communication unit 24, a second communication unit 25, a control unit 26, and the like. The PC 20 may be a desktop PC as shown in FIG. 3, or may be a notebook PC, a tablet terminal, or a smartphone.
The display unit 21 includes a liquid crystal display or the like and displays various information. The operation unit 22 includes a keyboard, a mouse, and the like; it receives input from the user and transmits it to the control unit 26. The storage unit 23 includes an HDD (Hard Disk Drive) or the like and stores various information.
The first communication unit 24 performs human body communication with other devices; it may also perform proximity communication. In this embodiment, the first communication unit 24 communicates with the communication unit 16 of the glasses-type device 10 and the communication unit 36 of the wristwatch-type device 30 using human body communication. The first communication unit 24 includes an electrode unit 24a that is provided, for example, on a part of the operation unit 22 (keyboard) and can contact the user's hand or arm, and a human body communication unit 24b that performs human body communication via the electrode unit 24a.
The second communication unit 25 communicates with other devices wirelessly or by wire. For example, the second communication unit 25 communicates with a Web server (not shown) or the like via the Internet.
The control unit 26 includes a CPU, a RAM, a ROM, and the like, and controls the entire PC 20. In this embodiment, the control unit 26 controls information communication with the glasses-type device 10 and the wristwatch-type device 30. The control unit 26 also exchanges information with a Web server or the like to display Web sites on the display unit 21 and to perform various information processing.
Next, the processing of each device of the electronic device system 100 will be described. As a premise of the processing of the electronic device system 100, it is assumed that the PC 20 is paired with a specific glasses-type device 10 and with a specific wristwatch-type device 30. In this embodiment, pairing means putting a plurality of devices into a state in which cooperative processing between them can be executed. When the power of one device is off, the establishment of human body communication or proximity communication with the other device may turn on the power of the device that is off. In this embodiment, it is assumed that human body communication is possible between paired devices.
(Authentication instruction transmission processing by the PC 20)
FIG. 4 is a flowchart illustrating an example of the authentication instruction transmission processing by the PC 20. The processing of FIG. 4 is executed while the PC 20 is powered on. In step S10, the control unit 26 determines whether use restriction information is necessary. Use restriction information is necessary, for example, when making a transfer on a bank (Internet banking) site, when making a credit card payment, or when viewing a document file for which a confidentiality level is set. For example, when the user is about to execute a transfer on an Internet banking site, the determination in step S10 is affirmed and the process proceeds to step S12. While the determination in step S10 is negative, step S10 is repeated at predetermined time intervals.
図4は、PC20による認証指示送信処理の一例を示すフローチャートである。図4の処理は、PC20に電源が投入されている間において実行される処理である。図4の処理では、ステップS10において、制御部26が、使用制限情報が必要か否かを判断する。ここで、使用制限情報が必要な場合とは、銀行(ネットバンキング)のサイトにおいて振込みをする場合や、クレジット決済をする場合、機密度が設定された文書ファイルを閲覧する場合などである。例えば、ユーザがネットバンキングのサイトで振込みを実行しようとしている場合に、ステップS10の判断は肯定されて、ステップS12に移行する。一方、ステップS10の判断が否定されている間は、ステップS10を所定時間間隔で繰り返す。 (Authentication instruction transmission processing by PC 20)
FIG. 4 is a flowchart illustrating an example of authentication instruction transmission processing by the
When the determination in step S10 is affirmed and the process proceeds to step S12, the control unit 26 transmits an authentication instruction to the paired eyeglass-type device 10 and wristwatch-type device 30. After the process of step S12, the process returns to step S10.
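One iteration of the steps S10 to S12 loop can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation; the two callback names are hypothetical stand-ins for the check and the transmission performed by the control unit 26.

```python
# Hypothetical sketch of one iteration of the FIG. 4 loop (steps S10-S12).
# needs_restriction_info models the S10 check (e.g. the user has started a bank
# transfer); send_auth_instruction models the S12 transmission to the paired
# eyeglass-type and wristwatch-type devices. Both names are assumptions.

def auth_instruction_step(needs_restriction_info, send_auth_instruction) -> bool:
    """Run one iteration; return True if an authentication instruction was sent."""
    if needs_restriction_info():      # S10: is use restriction information needed?
        send_auth_instruction()       # S12: instruct both paired wearable devices
        return True
    return False                      # S10 negative: retry after a fixed interval
```

In the actual device, this step would be repeated at predetermined time intervals while the PC 20 is powered on.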
(Processing by the eyeglass-type device 10)
FIG. 5 is a flowchart illustrating an example of the processing performed by the eyeglass-type device 10. The process of FIG. 5 is executed while the eyeglass-type device 10 is powered on. In step S20, the control unit 17 waits until an authentication instruction is received from the PC 20, and proceeds to step S22 at the timing when the process of step S12 of FIG. 4 described above is performed.
In step S22, the control unit 17 instructs the retina information acquisition unit 18 to acquire the user's retina information.
Next, in step S24, the control unit 17 instructs the authentication unit 19 to execute user authentication for the eyeglass-type device 10 using the acquired retina information. The authentication unit 19 performs the user authentication with reference to the user authentication table (FIG. 8(a)) stored in the storage unit 15.
Next, in step S26, the control unit 17 determines whether an authentication result has been input from the wristwatch-type device 30. The determination in step S26 is negative when the user is not wearing the wristwatch-type device 30. If the determination is negative, the process proceeds to step S28; if it is affirmative, the process proceeds to step S30. In step S28, the control unit 17 determines that user authentication has failed in the wristwatch-type device 30, and the process then proceeds to step S30.
In step S30, the control unit 17 determines whether the user authentication result of the eyeglass-type device 10 matches the user authentication result of the wristwatch-type device 30. If the determination is affirmative, the process proceeds to step S32, and the control unit 17 generates use restriction information (none). Use restriction information (none) indicates that the user can make transfers or the like on a net banking site without any limit on the amount, that no limit is placed on the amount usable in credit card payments, and that the user can use document files with a set confidentiality level without restriction. After step S32, the process proceeds to step S40, where the control unit 17 transmits the use restriction information (none) to the PC 20 via the communication unit 16, and the process of FIG. 5 ends.
On the other hand, if the determination in step S30 is negative, that is, if the user authentication results do not match, the process proceeds to step S34, in which the control unit 17 determines whether user authentication has failed in only one of the eyeglass-type device 10 and the wristwatch-type device 30.
If the determination in step S34 is affirmative, that is, if user authentication succeeded in one of the eyeglass-type device 10 and the wristwatch-type device 30, the process proceeds to step S36, and the control unit 17 generates use restriction information (partial restriction). Use restriction information (partial restriction) indicates that the user can make transfers or the like on a net banking site only up to a certain amount, that the amount usable in credit card payments is limited compared with the unrestricted case, and that the user can use document files with a set confidentiality level only under predetermined restrictions. After step S36, the control unit 17 transmits the use restriction information (partial restriction) to the PC 20 via the communication unit 16 in step S40, and the entire process of FIG. 5 ends.
On the other hand, if the determination in step S34 is negative, that is, if user authentication has failed in both the eyeglass-type device 10 and the wristwatch-type device 30, the process proceeds to step S38, and the control unit 17 generates use restriction information (all restrictions). Use restriction information (all restrictions) indicates that the user cannot make transfers or the like on a net banking site, that credit card payments are entirely disabled, and that the user cannot use document files with a set confidentiality level. After step S38, the control unit 17 transmits the use restriction information (all restrictions) to the PC 20 via the communication unit 16 in step S40, and the entire process of FIG. 5 ends.
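The branching in steps S30 to S38 amounts to a small decision function. The sketch below is illustrative only: it reads the "match" in step S30 as both devices having successfully authenticated the same registered user, and the string labels for the three restriction levels are assumptions, not terms from the patent.

```python
# Hypothetical sketch of the decision in FIG. 5, steps S30-S38.
# "Match" in S30 is read here as both authentications succeeding for the same
# user; the labels "none" / "partial" / "all" are illustrative stand-ins.

def generate_use_restriction(glasses_auth_ok: bool, watch_auth_ok: bool) -> str:
    if glasses_auth_ok and watch_auth_ok:   # S30 affirmative: results match
        return "none"                       # S32: no use restriction
    if glasses_auth_ok or watch_auth_ok:    # S34: exactly one authentication failed
        return "partial"                    # S36: partial restriction
    return "all"                            # S38: full restriction
```

The generated level is then what step S40 transmits to the PC 20.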
(Processing by the wristwatch-type device 30)
FIG. 6 is a flowchart illustrating an example of the processing by the wristwatch-type device 30. The process of FIG. 6 is executed while the wristwatch-type device 30 is powered on. In step S50, the control unit 37 waits until an authentication instruction is received from the PC 20, and proceeds to step S52 at the timing when step S12 of FIG. 4 is performed in the PC 20.
In step S52, the control unit 37 instructs the vein information acquisition unit 33 to acquire the user's vein information.
Next, in step S54, the control unit 37 instructs the authentication unit 35 to execute user authentication using the acquired vein information. The authentication unit 35 performs the user authentication with reference to the user authentication table (FIG. 8(b)) stored in the storage unit 34.
Next, in step S56, the control unit 37 transmits the authentication result of step S54 to the eyeglass-type device 10 via the communication unit 36. As the authentication result, the control unit 37 transmits information indicating authentication success or authentication failure.
Next, in step S58, the control unit 37 determines whether the authentication result was successfully transmitted to the eyeglass-type device 10. If the determination is affirmative, the entire process of FIG. 6 ends. The process of FIG. 6 can end in this case because, as described above, the eyeglass-type device 10 performs the processing that uses the authentication result (S30 to S40 in FIG. 5).
On the other hand, if the determination in step S58 is negative, that is, if the authentication result could not be transmitted because human body communication with the eyeglass-type device 10 was not established, the control unit 37 proceeds to step S60.
In step S60, the control unit 37 determines whether the user authentication in step S54 succeeded. If the determination is affirmative, the process proceeds to step S62, and the control unit 37 generates use restriction information (partial restriction). After step S62, the control unit 37 transmits the use restriction information (partial restriction) to the PC 20 via the communication unit 36 in step S66, and the entire process of FIG. 6 ends.
On the other hand, if the determination in step S60 is negative, that is, if the user authentication failed, the process proceeds to step S64. Reaching step S64 means that user authentication in the eyeglass-type device 10 was not performed and that user authentication in the wristwatch-type device 30 failed. Therefore, in step S64, the control unit 37 generates use restriction information (all restrictions). Then, in step S66, the control unit 37 transmits the use restriction information (all restrictions) to the PC 20 via the communication unit 36, and the entire process of FIG. 6 ends.
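The fallback in steps S58 to S66, where the wristwatch-type device 30 generates the restriction itself because the eyeglass-type device 10 cannot be reached, can be sketched as follows. The function name and restriction labels are illustrative assumptions.

```python
# Hypothetical sketch of FIG. 6, steps S58-S66. When the result reaches the
# glasses (S58 affirmative), the watch reports nothing further and the glasses
# combine both results (FIG. 5). Otherwise the glasses-side authentication
# never ran, so at best a partial restriction can be granted.

def watch_fallback_restriction(sent_to_glasses: bool, vein_auth_ok: bool):
    if sent_to_glasses:                           # S58: glasses handle S30-S40
        return None                               # nothing for the watch to send
    return "partial" if vein_auth_ok else "all"   # S60 -> S62 / S64
```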
(Use restriction information utilization processing by the PC 20)
FIG. 7 is a flowchart illustrating an example of the use restriction information utilization processing by the PC 20. In step S14, the control unit 26 waits until use restriction information is received. That is, the control unit 26 proceeds to step S16 at the timing when step S40 of FIG. 5 is executed in the eyeglass-type device 10 or step S66 of FIG. 6 is executed in the wristwatch-type device 30.
In step S16, the control unit 26 executes processing based on the use restriction information. For example, if use restriction information (none) is received when the user accesses a net banking site, the control unit 26 transmits the use restriction information (none) to the server that manages the site. Based on this information, the server removes the limit on the amount the user can use, so the user can make transfers or the like on the site without any amount limit. If use restriction information (partial restriction) is received, the control unit 26 transmits it to the server, which then places a limit on the amount the user can use; the user can make transfers or the like only up to a predetermined amount. If use restriction information (all restrictions) is received, the control unit 26 transmits it to the server, which then prohibits use by the user; the user cannot make transfers or the like on the site. The same processing is performed at the time of credit card settlement.
Also, for example, if use restriction information (none) is received when the user accesses a document file for which a confidentiality level is set, the control unit 26 permits the user to access the document file. If use restriction information (partial restriction) is received, the control unit 26 permits access according to the confidentiality level set in the document file. Further, if use restriction information (all restrictions) is received, the control unit 26 prohibits the user from accessing the document file.
As described above, in the first embodiment, the control unit 17 of the eyeglass-type device 10 generates setting information relating to the user's operations (use restriction information) based on the user authentication result using the user's retina and the user authentication result using the user's vein (S32, S36, S38). This makes it possible to generate appropriate use restriction information based on the two user authentication results, and using that information improves the security of the apparatus.
In the first embodiment, the eyeglass-type device 10 includes the communication unit 16 that transmits the setting information (use restriction information) to the PC 20, so the user's operation settings on the PC 20 can be made appropriately based on the two user authentication results. For example, when a user who is not the authorized user of the PC 20 wears at least one of the eyeglass-type device 10 and the wristwatch-type device 30, use can be restricted. Conversely, even if the user of the PC 20 is the authorized user, use is restricted unless at least one of the eyeglass-type device 10 and the wristwatch-type device 30 is worn. Thus, in the present embodiment, use is restricted unless the authorized user of the PC 20 wears both the eyeglass-type device 10 and the wristwatch-type device 30, which improves security. This can prevent a person who has obtained the PIN of the user's bank account and is impersonating the user from stealing money from the account.
In the first embodiment, the communication unit 16 of the eyeglass-type device 10 transmits the use restriction information to the PC 20 by human body communication or proximity communication, so the control unit 17 can obtain the user authentication information of a user in the vicinity of the PC 20, that is, a user who is highly likely to be actually using the PC 20.
In the present embodiment, the control unit 37 of the wristwatch-type device 30 acquires the authentication result of a part of the user's body (the veins of the arm) and transmits it to the eyeglass-type device 10 by human body communication via the communication unit 36. This allows user authentication results to be exchanged between the devices the user is wearing.
In the first embodiment, the case has been described in which the eyeglass-type device 10 and the wristwatch-type device 30 generate use restriction information that sets the user's operations on the PC 20 based on the two user authentication results, but the invention is not limited to this. For example, the eyeglass-type device 10 and the wristwatch-type device 30 may generate use restriction information for setting the user's operations on the eyeglass-type device 10 and the wristwatch-type device 30 themselves. This can improve the security of the eyeglass-type device 10 and the wristwatch-type device 30.
In the first embodiment, the user authentication in the eyeglass-type device 10 is performed using retina information, but this is not limiting; iris information may be used for authentication relating to the user's eye. Iris authentication is disclosed, for example, in Japanese Patent Application Laid-Open No. 2013-148961 (Patent No. 5360931). Also, in the first embodiment, user authentication in the wristwatch-type device 30 is performed using vein information, but this is not limiting; fingerprint information may be used for authentication relating to the user's hand. The authentication method is not limited to the eyes and hands; authentication using other parts of the user's body may be employed.
In the first embodiment, the user authentication processing using the eyeglass-type device 10 and the wristwatch-type device 30 may be executed at the timing when the user looks at the display unit 21 of the PC 20. In this case, when the control unit 17 recognizes from a captured image that the screen of the display unit 21 has been imaged by the imaging unit 11, it starts the processing of FIG. 5 and notifies the control unit 37 of the wristwatch-type device 30 via the communication units 16 and 36. The control unit 37 of the wristwatch-type device 30 then executes the processing of FIG. 6 at the timing of the notification. The control unit 17 may determine whether the screen of the display unit 21 has been imaged by the imaging unit 11 based on, for example, whether the captured image contains a characteristic portion of the display unit 21 or a characteristic portion of the image displayed on the screen of the display unit 21.
In the first embodiment, the case where the wristwatch-type device 30 transmits the user authentication result to the eyeglass-type device 10 (S56) has been described, but this is not limiting. For example, the eyeglass-type device 10 may transmit its user authentication result to the wristwatch-type device 30.
In the first embodiment, the case has been described in which the use restriction information generated based on the user authentication results of the eyeglass-type device 10 and the wristwatch-type device 30 is transmitted to the PC 20, but this is not limiting. For example, the user authentication result of the eyeglass-type device 10 and the user authentication result of the wristwatch-type device 30 may both be transmitted to the PC 20. In this case, the control unit 26 of the PC 20 generates the use restriction information based on the two user authentication results.
In the first embodiment, the eyeglass-type device 10 and the wristwatch-type device 30 are adopted as devices worn by a person, but this is not limiting; a device worn at another position on the user's body, such as a contact-lens-type terminal, may be used.
In the first embodiment, the case where the device that receives the use restriction information is the PC 20 has been described, but this is not limiting. For example, it may be another device such as an imaging apparatus.
<< Second Embodiment >>
Next, the second embodiment will be described in detail with reference to FIGS. 9 to 11(b). FIG. 9 is a block diagram showing the configuration of an electronic device system 100' according to the second embodiment.
As can be seen by comparing FIG. 9 with FIG. 1, the electronic device system 100' of the second embodiment differs from the electronic device system 100 of the first embodiment in that the eyeglass-type device 10 includes an image recognition unit 101 and an OCR (Optical Character Recognition) unit 102. At least one of the image recognition unit 101 and the OCR unit 102 may be provided on the PC 20 side (which may be a tablet terminal or a smartphone), and the eyeglass-type device 10 may receive the result by human body communication or proximity communication.
The image recognition unit 101 extracts feature points of the image data captured by the imaging unit 11 of the eyeglass-type device 10 using a feature detection algorithm such as SURF (Speeded-Up Robust Features) or SIFT (Scale-Invariant Feature Transform), and compares them with images stored in the storage unit 15. In the present embodiment, for example, the image of a Web site displayed on the display unit 21 of the PC 20 that the user has viewed (captured by the imaging unit 11) is compared with images of Web sites viewed in the past (captured by the imaging unit 11 in the past; browsing images).
The OCR unit 102 recognizes characters included in the image data captured by the imaging unit 11 of the eyeglass-type device 10 and converts them into text data. In the present embodiment, as an example, the URL included in a captured image of a Web site is character-recognized and converted into text data.
Next, the phishing countermeasure processing by the eyeglass-type device 10 of the second embodiment will be described in detail along the flowchart of FIG. 10, with reference to the other drawings as appropriate.
The process of FIG. 10 is executed, for example, when the user instructs execution of the phishing countermeasure processing via the operation unit 13. In step S70, the control unit 17 acquires a captured image from the imaging unit 11. The imaging unit 11 captures still images at predetermined time intervals.
Next, in step S72, the control unit 17 instructs the image recognition unit 101 to determine, based on the captured image of the imaging unit 11, whether the user has viewed a browser on the display unit 21. In this case, the image recognition unit 101 refers to the captured image and determines, for example, whether the mark attached to the upper left corner of the window (the "※" mark in the case of FIG. 11(a)) is a browser-specific mark; if it is, the image recognition unit 101 determines that the user has viewed a browser.
If the determination in step S72 is negative, the process returns to step S70; if it is affirmative, the process proceeds to step S74.
In step S74, the control unit 17 temporarily stores the captured image. Next, in step S76, the control unit 17 instructs the image recognition unit 101 to compare the captured image with the past browsing images stored in the storage unit 15. For example, suppose the image captured this time by the imaging unit 11 is the image shown in FIG. 11(a). In this case, the image recognition unit 101 compares the image shown in FIG. 11(a) with each of the past browsing images stored in the storage unit 15 and calculates the similarity to each browsing image.
Next, in step S78, the control unit 17 determines whether there is a browsing image whose similarity is equal to or greater than a threshold value. A value such as 80% to 90% can be adopted as the threshold. If the determination in step S78 is negative, that is, if there is no browsing image whose similarity is equal to or greater than the threshold, the process proceeds to step S80, and the control unit 17 stores the temporarily stored captured image in the storage unit 15 as a past browsing image. The process then returns to step S70.
Now suppose that the similarity between the image captured this time, shown in FIG. 11(a), and a past browsing image such as that shown in FIG. 11(b) (an image captured when the Web site of "Internet Bank" was viewed) is equal to or greater than the threshold. In this case, the determination in step S78 is affirmed, and the control unit 17 proceeds to step S82.
In step S82, the control unit 17 uses the OCR unit 102 to compare the URL contained in the captured image with the URL of the similar browsing image. Here, the control unit 17 is assumed to know at which position in the browser the URL of the website is displayed. The control unit 17 therefore extracts the region containing the URL from both the captured image and the browsing image and has the OCR unit 102 convert each region into text. The control unit 17 then compares the resulting text URLs. Alternatively, the control unit 17 may use the image recognition unit 101 to compare the images of the URL regions directly.
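Step S82 can be sketched as below. Since a runnable example cannot include real OCR, the "image" is modeled as rows of text and the URL bar as a fixed row; `read_url_region` is a hypothetical stand-in for cropping the URL region and running the OCR unit 102 on it, and the URLs are placeholders, not the ones in FIG. 11.

```python
# Assumption for this sketch: the control unit knows the browser draws the
# URL in a fixed region; here that region is row 0 of a text "image".
URL_ROW = 0

def read_url_region(image_rows):
    """Stand-in for OCR unit 102: extract and 'OCR' the URL bar region."""
    return image_rows[URL_ROW].strip()

def urls_match(captured_rows, past_rows):
    """S84: compare the OCR'd URL of the capture with that of the past image."""
    return read_url_region(captured_rows) == read_url_region(past_rows)

captured = ["http://bank.example.net/", "<login form>"]
past     = ["http://bank.example.jp/",  "<login form>"]
assert not urls_match(captured, past)          # differing TLDs -> warn (S86)
assert urls_match(captured, list(captured))    # identical URLs -> no warning
```

The same comparison could instead be done on the URL-region pixels themselves, matching the image recognition unit 101 variant mentioned above.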
Next, in step S84, the control unit 17 determines whether the two text URLs are identical. In the case of FIGS. 11A and 11B, the URLs differ in the “.net” and “.jp” portions, so the determination in step S84 is negative and the process proceeds to step S86. When the captured image and a past browsing image are similar but their URLs differ, the website contained in the captured image (the website the user is viewing) may be a website set up for phishing.
In step S86, the control unit 17 issues a warning using the display unit 12; for example, it displays a message such as “This site may be a phishing site” on the display unit 12. The control unit 17 may instead issue the warning through a speaker (not shown), or transmit warning information to the control unit 26 of the PC 20 or the control unit 37 of the wristwatch-type device 30, which can then issue the warning on the display unit 21 of the PC 20, the display unit 31 of the wristwatch-type device 30, or the like. If the glasses-type device 10 or the wristwatch-type device 30 has a vibration unit (vibrator), the warning may also be issued by vibrating it. After step S86, the process returns to step S70.
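The warning step S86 fans out to whichever output channels a given configuration has (display 12, a speaker, the PC 20, the wristwatch-type device 30, a vibrator). A minimal dispatcher sketch, with channel names and callables chosen for illustration only:

```python
def issue_warning(message, channels):
    """S86: deliver the warning on every available output channel.
    `channels` maps a channel name to a send callable; which channels
    exist is device-dependent (an assumption of this sketch)."""
    delivered = []
    for name, send in channels.items():
        send(message)
        delivered.append(name)
    return delivered

log = []
channels = {
    "display12": lambda m: log.append(("display12", m)),  # glasses display
    "pc20":      lambda m: log.append(("pc20", m)),       # forwarded to PC 20
}
out = issue_warning("This site may be a phishing site", channels)
assert out == ["display12", "pc20"]
assert log[0] == ("display12", "This site may be a phishing site")
```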
On the other hand, if the determination in step S84 is positive, that is, if the compared URLs are identical, the process returns to step S70 without passing through step S86.
As described above, according to the second embodiment, the control unit 17 compares the URL contained in the image captured by the imaging unit 11 with the URL contained in a past browsing image stored in the storage unit 15, determines whether the imaging unit 11 has captured information related to the past browsing image (that is, whether the website the user is viewing is the same as the website in the past browsing image), and warns the user according to the determination result. This warning can keep the user from accessing a fake website (such as a phishing site) that at first glance resembles a website browsed in the past.
In the second embodiment, the control unit 17 also stores captured images in the storage unit 15 as appropriate (S80), so that captured images can continually be added to the storage unit 15 as new past browsing images.
Further, in the second embodiment, the control unit 17 transmits to the PC 20 the result of determining whether the website the user is viewing is the same as a website contained in a past browsing image, and warns the user on the PC 20. This makes it possible to display the warning clearly and at an appropriate position for the user of the PC 20.
In the second embodiment described above, the URL contained in a past browsing image is converted into text at the same timing (S82) as the URL contained in the captured image, but this is not restrictive. For example, the URL may be converted into text when the past browsing image is stored in the storage unit 15 (S80) and stored in the storage unit 15 in association with that image.
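That variant, OCR once at storage time (S80) rather than at every comparison (S82), is a simple caching pattern. A sketch under the same assumptions as before (`ocr_url` is a hypothetical OCR callable; the text-row "image" is a stand-in):

```python
def store_browsing_image(store, image, ocr_url):
    """S80 variant: run OCR on the URL region once when the image is
    stored, and keep the text alongside the image so that S82 only needs
    to OCR the newly captured image."""
    store.append({"image": image, "url": ocr_url(image)})

store = []
store_browsing_image(store, ["http://bank.example.jp/", "<page>"],
                     lambda img: img[0].strip())
assert store[0]["url"] == "http://bank.example.jp/"
```

This trades a little extra storage for avoiding repeated OCR of the same past images.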
In the second embodiment, the control unit 17 of the glasses-type device 10 performs the image comparison and the URL comparison, but this is not restrictive. For example, the comparison of images and URLs may be performed by the PC 20, by the wristwatch-type device 30, or by a server (cloud server) connected to the PC 20. In that case, the control unit 17 of the glasses-type device 10 need only transmit the images captured by the imaging unit 11 to the PC 20, the wristwatch-type device 30, or the server as needed.
In the second embodiment, the control unit 17 of the glasses-type device 10 determines whether the imaging unit 11 has captured information related to a past browsing image by comparing the URL in the captured image with the URL in a past browsing image stored in the storage unit 15, but this is not restrictive. For example, the control unit 17 may communicate with a predetermined Web server or the like over the Internet, look up the URL contained in the image captured by the imaging unit 11, determine the authenticity of that URL (that is, whether it is the URL of a fake website), and output (report) the determination result to the user. This removes the need to store the URLs of past browsing images in the storage unit 15, simplifying the processing and reducing the amount of data stored in the storage unit 15.
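This online-lookup variant can be sketched as follows. The `lookup` callable is a hypothetical client for the unspecified Web server mentioned above (a real device might query a URL-reputation service); the flagged URL and return labels are invented for the example.

```python
def check_url_online(url, lookup):
    """Online variant: ask an external service whether `url` is a known
    fake site instead of comparing against locally stored past URLs.
    `lookup(url)` is assumed to return True when the URL is flagged."""
    return "fake" if lookup(url) else "ok"

# A set standing in for the remote service's blocklist.
flagged = {"http://bank.example.net/"}
assert check_url_online("http://bank.example.net/", flagged.__contains__) == "fake"
assert check_url_online("http://bank.example.jp/",  flagged.__contains__) == "ok"
```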
Instead of the glasses-type device 10, any other wearable device that can be worn near the head may be employed in the second embodiment.
Some of the functions of the glasses-type device 10 and the wristwatch-type device 30 may also be separated. For example, the display unit may be provided as a contact-lens-type terminal, with the other functions provided in the glasses-type device or the wristwatch-type device.
The above processing functions can be realized by a computer. In that case, a program describing the processing contents of the functions to be performed by the processing device (CPU) is provided, and executing that program on a computer realizes the above processing functions on the computer. The program describing the processing contents can be recorded on a computer-readable recording medium (excluding carrier waves).
When the program is distributed, it is sold, for example, in the form of a portable recording medium such as a DVD (Digital Versatile Disc) or a CD-ROM (Compact Disc Read Only Memory) on which the program is recorded. The program may also be stored in a storage device of a server computer and transferred from the server computer to another computer via a network.
A computer that executes the program stores, for example, the program recorded on the portable recording medium, or the program transferred from the server computer, in its own storage device, then reads the program from that storage device and executes processing according to it. The computer may also read the program directly from the portable recording medium and execute processing according to it, or, each time part of the program is transferred from the server computer, sequentially execute processing according to the received part.
The embodiments described above are preferred examples of the present invention and can be combined as appropriate; for example, the authentication technique of the first embodiment may be combined with the anti-phishing technique of the second embodiment. The present invention is not, however, limited to these embodiments, and various modifications are possible without departing from the gist of the invention. The disclosures of the publications, US patent application publications, and the like cited in the above description are incorporated herein by reference.
DESCRIPTION OF SYMBOLS
10 Glasses-type device
11 Imaging unit
15 Storage unit
16 Communication unit
17 Control unit
18 Retina information acquisition unit
20 PC
110 Frame
Claims (20)
1. An electronic device comprising: a first input unit that inputs an authentication result of a first part of a user; a second input unit that inputs an authentication result of a second part of the user; and a generation unit that generates setting information related to an operation of the user based on input information of the first and second input units.
2. The electronic device according to claim 1, further comprising a transmission unit that transmits the setting information to an external device.
3. The electronic device according to claim 2, wherein the transmission unit transmits the setting information by proximity communication or by human body communication via the user.
4. The electronic device according to claim 1, further comprising a setting unit that performs settings related to the user's operation using the setting information.
5. The electronic device according to claim 4, wherein the setting unit sets a restriction on the user's operation based on the input results of the first and second input units.
6. The electronic device according to any one of claims 1 to 5, wherein the first input unit inputs an authentication result related to the user's eyes.
7. The electronic device according to any one of claims 1 to 6, wherein the second input unit inputs an authentication result related to the user's hand.
8. The electronic device according to any one of claims 1 to 7, further comprising a mounting portion for wearing near the user's face.
9. An electronic device comprising: an input unit that inputs an authentication result of a part of a user's body; and a transmission unit that transmits input information of the input unit to an external device by human body communication.
10. The electronic device according to claim 9, further comprising a generation unit that, when the transmission unit cannot transmit the input information to the external device, generates setting information related to the user's operation based on the authentication result input by the input unit.
11. An electronic device comprising: a storage unit that stores information; an imaging unit that performs imaging; a determination unit that determines whether the imaging unit has captured information related to the information stored in the storage unit; and a notification unit that notifies a user according to a determination result of the determination unit.
12. The electronic device according to claim 11, wherein information captured using the imaging unit is stored in the storage unit.
13. The electronic device according to claim 11 or 12, wherein the storage unit stores text data, and the determination unit performs the determination based on text data generated from imaging data captured by the imaging unit and the text data stored in the storage unit.
14. The electronic device according to any one of claims 11 to 13, further comprising a transmission unit that transmits the determination result of the determination unit to an external device.
15. The electronic device according to claim 14, wherein the transmission unit transmits the information by proximity communication or by human body communication via the user.
16. An electronic device comprising: an imaging unit that captures character information; a determination unit that determines the authenticity of the character information captured by the imaging unit by communicating with an external device; and a notification unit that notifies a user according to a determination result of the determination unit.
17. An electronic device comprising: an input unit that inputs an authentication result related to a user's eyes; and a generation unit that generates setting information related to the user's operation based on input information of the input unit.
18. The electronic device according to claim 17, wherein the generation unit restricts the setting information based on the authentication result input by the input unit.
19. An electronic device comprising: an input unit that inputs an authentication result related to a user's eyes; and a transmission unit that transmits input information of the input unit to an external device by human body communication.
20. The electronic device according to claim 19, wherein the transmission unit transmits information related to restriction of use of the external device by the user.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-269755 | 2013-12-26 | ||
JP2013269755 | 2013-12-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015098253A1 true WO2015098253A1 (en) | 2015-07-02 |
Family
ID=53478132
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/077616 WO2015098253A1 (en) | 2013-12-26 | 2014-10-16 | Electronic device |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2015098253A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020021811A1 (en) * | 2018-07-25 | 2020-01-30 | 日本電信電話株式会社 | Analysis device, analysis method, and analysis program |
WO2021204947A1 (en) * | 2020-04-08 | 2021-10-14 | Heba Bevan | Infection and disease sensing systems |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002312324A (en) * | 2001-04-13 | 2002-10-25 | Sony Corp | Wristband-type authentication device and system, and information output device |
JP2003058509A (en) * | 2001-08-15 | 2003-02-28 | Sony Corp | Identification processing system, identification processing method, identification device, and computer program |
JP2003060635A (en) * | 2001-08-13 | 2003-02-28 | Sony Corp | Personal authentication system, method therefor, authentication device and computer program |
JP2005528662A (en) * | 2001-08-28 | 2005-09-22 | ヒューレット・パッカード・カンパニー | User wearable wireless transaction device using biometric user verification |
JP2007179206A (en) * | 2005-12-27 | 2007-07-12 | Fujitsu Fip Corp | Information communication system, toolbar providing server, information providing server, unauthorized site detection method, and toolbar program |
JP2008067218A (en) * | 2006-09-08 | 2008-03-21 | Sony Corp | Imaging display device, and imaging display method |
JP2008198028A (en) * | 2007-02-14 | 2008-08-28 | Sony Corp | Wearable device, authentication method and program |
JP2010516007A (en) * | 2007-01-16 | 2010-05-13 | インターナショナル・ビジネス・マシーンズ・コーポレーション | Method and apparatus for detecting computer fraud |
- 2014-10-16: WO PCT/JP2014/077616 patent/WO2015098253A1/en active Application Filing
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002312324A (en) * | 2001-04-13 | 2002-10-25 | Sony Corp | Wristband-type authentication device and system, and information output device |
JP2003060635A (en) * | 2001-08-13 | 2003-02-28 | Sony Corp | Personal authentication system, method therefor, authentication device and computer program |
JP2003058509A (en) * | 2001-08-15 | 2003-02-28 | Sony Corp | Identification processing system, identification processing method, identification device, and computer program |
JP2005528662A (en) * | 2001-08-28 | 2005-09-22 | ヒューレット・パッカード・カンパニー | User wearable wireless transaction device using biometric user verification |
JP2007179206A (en) * | 2005-12-27 | 2007-07-12 | Fujitsu Fip Corp | Information communication system, toolbar providing server, information providing server, unauthorized site detection method, and toolbar program |
JP2008067218A (en) * | 2006-09-08 | 2008-03-21 | Sony Corp | Imaging display device, and imaging display method |
JP2010516007A (en) * | 2007-01-16 | 2010-05-13 | インターナショナル・ビジネス・マシーンズ・コーポレーション | Method and apparatus for detecting computer fraud |
JP2008198028A (en) * | 2007-02-14 | 2008-08-28 | Sony Corp | Wearable device, authentication method and program |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020021811A1 (en) * | 2018-07-25 | 2020-01-30 | 日本電信電話株式会社 | Analysis device, analysis method, and analysis program |
JPWO2020021811A1 (en) * | 2018-07-25 | 2021-02-15 | 日本電信電話株式会社 | Analytical equipment, analysis method and analysis program |
WO2021204947A1 (en) * | 2020-04-08 | 2021-10-14 | Heba Bevan | Infection and disease sensing systems |
GB2611919A (en) * | 2020-04-08 | 2023-04-19 | Bevan Heba | Infection and disease sensing systems |
GB2611919B (en) * | 2020-04-08 | 2025-03-12 | Bevan Heba | Infection and disease sensing systems |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11100208B2 (en) | Electronic device and method for controlling the same | |
US10341113B2 (en) | Password management | |
Chen et al. | Eyetell: Video-assisted touchscreen keystroke inference from eye movements | |
US8963806B1 (en) | Device authentication | |
US10706396B2 (en) | Systems and methods for translating a gesture to initiate a financial transaction | |
US10621583B2 (en) | Wearable earpiece multifactorial biometric analysis system and method | |
Li et al. | Whose move is it anyway? Authenticating smart wearable devices using unique head movement patterns | |
Peng et al. | Continuous authentication with touch behavioral biometrics and voice on wearable glasses | |
US10205883B2 (en) | Display control method, terminal device, and storage medium | |
Neal et al. | Surveying biometric authentication for mobile device security | |
US20200026939A1 (en) | Electronic device and method for controlling the same | |
US10733275B1 (en) | Access control through head imaging and biometric authentication | |
US10803159B2 (en) | Electronic device and method for controlling the same | |
US20160042172A1 (en) | Method and apparatus for unlocking devices | |
JP2017527036A (en) | System and method for using eye signals in secure mobile communications | |
CN103870738A (en) | Wearable identity authentication device based on iris identification | |
US11080417B2 (en) | Private eye-to-eye communications with wearable heads up display | |
WO2021115424A1 (en) | Voice payment method and electronic device | |
KR102082418B1 (en) | Electronic device and method for controlling the same | |
JP6743883B2 (en) | Electronic device, authentication method and program | |
WO2015098253A1 (en) | Electronic device | |
KR20210130856A (en) | Electronic device and its control method | |
WO2015093221A1 (en) | Electronic device and program | |
JP6585518B2 (en) | Authentication system, method and program | |
Guo et al. | Shake, shake, i know who you are: Authentication through smart wearable devices |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14875011 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
NENP | Non-entry into the national phase |
Ref country code: JP |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14875011 Country of ref document: EP Kind code of ref document: A1 |