
WO2019039065A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2019039065A1
WO2019039065A1 (PCT/JP2018/023548)
Authority
WO
WIPO (PCT)
Prior art keywords
hand
control unit
state
information processing
finger
Prior art date
Application number
PCT/JP2018/023548
Other languages
French (fr)
Japanese (ja)
Inventor
Jun Irie
Original Assignee
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2019039065A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion

Definitions

  • The present disclosure relates to an information processing device, an information processing method, and a program.
  • Patent Document 1 discloses an operation device that transmits an operation command based on the angle between fingers to a device when the user performs a gesture operation, that is, an input operation, using a hand.
  • In the technique of Patent Document 1, however, an operation command not intended by the user may be transmitted to the device. That is, the input operation performed by the user may be erroneously recognized.
  • The present disclosure therefore proposes a new and improved information processing apparatus, information processing method, and program that can more accurately determine an input operation performed by a user using a hand.
  • According to the present disclosure, an information processing apparatus includes a control unit that determines the type of input operation based on the state of the hand when the distance between an object and the operation site of the hand falls within a predetermined range.
  • According to the present disclosure, an information processing method is provided in which a processor determines the type of input operation based on the state of the hand when the distance between an object and the operation site of the hand falls within a predetermined range.
  • According to the present disclosure, a program causes a computer to realize a control function of determining the type of input operation based on the state of the hand when the distance between an object and the operation site of the hand falls within a predetermined range.
  • In each case, the control unit or the like determines the type of the input operation based on the state of the hand when the distance between the object and the operation site of the hand falls within a predetermined range.
  • That is, the control unit or the like determines the type of the input operation by considering both the distance between the object and the operation site of the hand and the state of the hand, and can therefore determine the user's input operation more accurately.
  • Note that the above-mentioned effects are not necessarily limiting; together with or in place of them, any of the effects shown in this specification, or other effects that can be grasped from this specification, may be achieved.
  • The present inventor diligently studied techniques capable of more accurately determining the gesture operation, that is, the input operation, performed by the user using a hand.
  • First, the inventor examined a technique for determining the type of input operation based on the distance between a real object and the operation site of the hand.
  • Here, the real object is the portion on which the user performs an input operation.
  • As described in detail later, an image on which the user can perform an input operation can be displayed on the real object.
  • The operation site is the part of the hand with which the user performs an input operation, for example the fingertip of the index finger.
  • Specifically, the inventor examined a technique for determining that the input operation is a touch operation when the operation site of the hand contacts the real object. This is because the influence of individual differences on distance can be said to be small: when the user wants to operate an operation target, it is assumed that the operation site of the hand is brought close to that target.
  • However, depending on sensor accuracy it can be difficult to judge actual contact from distance alone, so the present inventor made further studies. As a result, the inventor found that the user's input operation can be determined more accurately by focusing on both the distance between the object and the operation site of the hand and the state of the hand, and by using these pieces of information together. The present embodiments were conceived based on this knowledge and are described in detail below.
  • The information processing apparatus 10 includes a storage unit 11, a display unit 12, a detection unit 13, a control unit 14, and a housing unit 15.
  • The information processing apparatus 10 is, for example, a projection apparatus.
  • The information processing apparatus 10 has a hardware configuration including a central processing unit (CPU), a read only memory (ROM), a random access memory (RAM), a non-volatile memory, a display device, a detection device, and a housing.
  • The ROM stores information necessary for the operation of the information processing apparatus 10, such as programs.
  • The CPU reads and executes the programs stored in the ROM.
  • The display device projects images onto various real objects (for example, the table 100, the wall 200, and the like) to display the images on those real objects.
  • The detection device captures images of, for example, the user's hand and the real objects.
  • The detection device detects the state (for example, the shape) and position of the hand of the user (the person U shown in FIGS. 2 and 3) based on the captured images. Furthermore, the detection device also detects the position, shape, and the like of the real objects.
  • The detection device is, for example, a depth sensor, but is not limited to this as long as it can realize the above-mentioned functions.
  • For example, the detection device may be any device capable of specifying the positions of three-dimensional feature points of the hand.
  • The housing incorporates each hardware component.
  • The storage unit 11, the display unit 12, the detection unit 13, the control unit 14, and the housing unit 15 are realized by this hardware configuration.
  • The information processing apparatus 10 may further include operation buttons such as a power button. These operation buttons allow input operations by the user.
  • The information processing device 10 may also include a communication device or the like that can communicate with other information processing devices.
  • The storage unit 11 is configured by, for example, the ROM and stores various types of information.
  • The storage unit 11 stores images displayed by the display unit 12, programs necessary for the processing of the information processing apparatus 10, and the like.
  • The display unit 12 includes, for example, the display device.
  • The display unit 12 displays an image on a real object by projecting the image onto various real objects (for example, the table 100, the wall 200, and the like). In the example shown in FIGS. 2 and 3, the display unit 12 displays the image 12A on the real object.
  • The type of real object on which the display unit 12 displays images is not particularly limited, and neither is the type of image displayed by the display unit 12. For example, the display unit 12 may display an image that is the target of an input operation by the user.
  • The detection unit 13 includes, for example, the detection device.
  • The detection unit 13 captures images of, for example, the user's hand and a real object. The detection unit 13 then detects the state and position of the user's hand (for example, the right hand RH) based on the captured images.
  • The state of the hand includes, for example, the shape of the hand.
  • The position of the hand includes the positions of the fingers, the palm, and the like.
  • The detection unit 13 also detects the position, shape, and the like of the real object based on the captured images. The detection unit 13 then outputs detection information on the detection results to the control unit 14.
  • The control unit 14 is configured of, for example, a CPU (processor) and controls each component of the information processing apparatus 10. For example, the control unit 14 causes the display unit 12 to display various images. Furthermore, the control unit 14 recognizes the distance between the real object and the operation site of the hand based on the detection information.
  • The real object is an object onto which an image is projected from the display unit 12, for example the table 100 or the wall 200.
  • The operation site is the part of the hand with which the user performs an input operation.
  • The control unit 14 includes, in the operation site, one or more of the fingers and the palm of the hand. More specifically, the control unit 14 includes the index finger in the operation site, and more specifically still, the fingertip of the index finger. The fingertip is, for example, the part beyond the first joint. For example, the control unit 14 includes the fingertip RH21 of the forefinger RH2 of the right hand RH in the operation site.
  • The operation site is not limited to this and may be the fingertip of another finger, the palm, the left hand, or the like.
  • The operation site may be set in advance or may be set arbitrarily by the user. Modifications of the operation site are described later.
  • The control unit 14 determines the type of the input operation based on the state of the hand. For example, the control unit 14 determines whether the input operation is a touch operation based on the state of the hand. The control unit 14 can thereby easily and accurately determine whether the user is performing a touch operation.
  • The touch operation covers various operations performed by touching a real object, in other words, an image displayed on the real object. Examples of the touch operation include a selection operation (selecting the touched portion) and a drawing operation (drawing a line along the movement trajectory of the touching operation site). The specific determination method is described later.
  • The control unit 14 then performs processing according to the input operation.
  • The housing unit 15 includes, for example, the above-described housing and incorporates the hardware configuration, in other words, the storage unit 11, the display unit 12, the detection unit 13, and the control unit 14.
  • In step S10, the display unit 12 displays an image on a real object.
  • The user brings, for example, the forefinger RH2 of the right hand RH close to the image in order to perform an input operation on the image.
  • The detection unit 13 captures an image of the user's hand and the real object (the real object on which the image is displayed).
  • In this example, the detection unit 13 captures an image of the user's right hand RH and the real object.
  • The detection unit 13 detects the state and position of the user's hand based on the captured image. Furthermore, the detection unit 13 also detects the position and shape of the real object. Subsequently, the detection unit 13 outputs detection information on the detection results to the control unit 14.
  • The control unit 14 specifies the distance between the real object and the operation site of the hand based on the detection information.
  • In this example, the control unit 14 specifies the distance between the real object and the fingertip RH21.
  • The control unit 14 then determines whether the distance between the real object and the operation site of the hand is within a predetermined range.
  • The predetermined range here is not particularly limited. However, since the control unit 14 determines in the processing described later whether the input operation is a touch operation, the predetermined range is preferably fairly small, for example about several centimeters. The predetermined range may be set in advance or may be set by the user.
  • If the control unit 14 determines that the distance between the real object and the operation site is within the predetermined range, the process proceeds to step S20, and the control unit 14 thereafter determines the type of the input operation based on the state of the hand. When the distance between the real object and the operation site of the hand exceeds the predetermined range, the control unit 14 ends this processing.
  • In step S20, the control unit 14 determines the state of the hand based on the detection information.
  • In this example, the control unit 14 specifies the state of the right hand RH.
  • In step S30, the control unit 14 determines whether the state of the hand is the touch operation permission state.
  • Here, the touch operation permission state means the state of the hand in which the user is permitted to perform the touch operation.
  • The control unit 14 sets one of the fingers of the hand as a determination target finger.
  • For example, the control unit 14 may include the thumb in the determination target finger.
  • The control unit 14 may then judge that the state of the hand is the touch operation permission state when the fingertip of the thumb is hidden by the palm.
  • Here, the state in which the fingertip of the thumb is hidden by the palm means, for example, that the fingertip of the thumb is at a position facing the palm.
  • When the detection unit 13 captures an image of the hand from the back side of the hand, the thumb is hidden by the palm, so the thumb hardly appears in the captured image.
  • When the detection unit 13 captures the hand from the palm side, the thumb lying over the palm is captured.
  • In the state shown in FIG. 5A, the state of the right hand RH is the touch operation permission state; that is, FIG. 5A is an example of the touch operation permission state.
  • Conversely, in the state shown in FIG. 5B, the control unit 14 may determine that the right hand RH is not in the touch operation permission state.
  • Comparing FIGS. 5A and 5B, the two figures differ in the state of the thumb.
  • Thus, when the thumb is included in the determination target finger, whether the state of the hand is the touch operation permission state depends on the state of the thumb.
  • The thumb tends to be short and thick compared with the other fingers, and individual differences in this tendency are thought to be small, so the thumb is often easier to identify than the other fingers. Therefore, by including the thumb in the determination target finger, the control unit 14 can more easily and accurately determine whether the state of the hand is the touch operation permission state.
  • Furthermore, the control unit 14 determines whether the state of the hand is the touch operation permission state based on whether the fingertip of the thumb is hidden by the palm. As shown in FIGS. 5A and 5B, the difference in shape between the case where the fingertip of the thumb is hidden by the palm and the case where it is not is clear, so the control unit 14 can more easily and accurately determine whether the state of the hand is the touch operation permission state.
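  • As an illustration only (not part of the original disclosure), the sketch below approximates the check of whether the fingertip of the thumb is hidden by the palm, using a palm-centre point and an outward palm normal. The function is_touch_permitted, the thresholds, and the sample coordinates are all assumptions.

```python
import numpy as np


def is_touch_permitted(thumb_tip: np.ndarray, palm_center: np.ndarray,
                       palm_normal: np.ndarray, max_offset_m: float = 0.04) -> bool:
    """Approximate "thumb fingertip hidden by the palm".

    The hand is treated as being in the touch operation permission state when the
    thumb tip lies close to the palm centre and on the inner (palm) side of the
    palm plane. palm_normal is assumed to point away from the palm surface on the
    side toward which the fingers fold.
    """
    n = palm_normal / np.linalg.norm(palm_normal)
    offset = thumb_tip - palm_center
    height = float(np.dot(offset, n))                     # distance above the palm plane
    lateral = float(np.linalg.norm(offset - height * n))  # distance along the palm plane
    return 0.0 <= height <= max_offset_m and lateral <= max_offset_m


palm = np.array([0.10, 0.02, 0.45])
normal = np.array([0.0, 0.0, -1.0])
print(is_touch_permitted(np.array([0.105, 0.025, 0.43]), palm, normal))  # thumb folded over the palm -> True
print(is_touch_permitted(np.array([0.03, 0.02, 0.45]), palm, normal))    # thumb stretched away -> False
```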
  • When the control unit 14 determines that the state of the hand is the touch operation permission state, it determines that the input operation performed by the user is a touch operation, and the process proceeds to step S40.
  • When the control unit 14 determines that the state of the hand is not the touch operation permission state, it determines that the input operation performed by the user is an operation other than the touch operation, and the process proceeds to step S50.
  • In this way, the control unit 14 determines the type of the input operation based on the state of the hand.
  • In step S40, the control unit 14 accepts the touch operation by the user. That is, when the user performs a touch operation (input operation) using the operation site, the control unit 14 performs processing according to the touch operation. For example, as shown in FIG. 6A, when the user moves the fingertip RH21 of the forefinger RH2 over the image 12A, the control unit 14 draws the line image 12B along the movement trajectory. Thereafter, the control unit 14 ends the present process.
  • In step S50, the control unit 14 does not accept the touch operation.
  • However, the control unit 14 may accept an input operation other than the touch operation.
  • For example, when the user performs a hover operation, the control unit 14 may perform a process according to the hover operation (for example, a page-turning process). Thereafter, the control unit 14 ends the present process.
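  • Purely as an assumption-laden example (not taken from the disclosure), the sketch below shows how accepting a touch (step S40) versus not accepting it (step S50) might be dispatched: a drawn stroke is extended while touches are accepted, and a page is turned when a hover-like gesture is received instead. The class TouchHoverDispatcher and its behavior are illustrative assumptions.

```python
from typing import List, Tuple

Point2D = Tuple[float, float]  # position of the operation site on the displayed image


class TouchHoverDispatcher:
    """Toy dispatcher: draw while a touch is accepted, turn a page on hover."""

    def __init__(self) -> None:
        self.stroke: List[Point2D] = []  # trajectory drawn on the image (step S40)
        self.page = 0                    # page counter advanced by hovers (step S50)

    def on_frame(self, operation: str, position_on_image: Point2D) -> None:
        if operation == "touch":
            self.stroke.append(position_on_image)  # extend the line image
        elif operation == "other":
            self.page += 1                          # e.g. a page-turning process


dispatcher = TouchHoverDispatcher()
dispatcher.on_frame("touch", (0.20, 0.30))
dispatcher.on_frame("touch", (0.25, 0.30))
dispatcher.on_frame("other", (0.25, 0.30))
print(dispatcher.stroke, dispatcher.page)  # [(0.2, 0.3), (0.25, 0.3)] 1
```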
  • As described above, the control unit 14 determines the type of the input operation based on the state of the hand when the distance between the real object and the operation site of the hand falls within a predetermined range. In other words, the control unit 14 considers both the distance between the real object and the operation site of the hand and the state of the hand, and can therefore determine the user's input operation more accurately. In particular, individual differences in the distance between the real object and the operation site of the hand are considered to be small.
  • Moreover, even when it is difficult to judge contact from distance alone, the control unit 14 determines the type of the user's input operation based on the state of the hand, so it can still determine the input operation easily and accurately. As a result, the user can perform input operations more reliably.
  • When the control unit 14 determines that the state of the hand is the touch operation permission state, it can determine that the input operation is the touch operation. The control unit 14 can thereby more easily and accurately determine whether the user has performed a touch operation.
  • The control unit 14 includes, in the operation site, one or more of the fingers and the palm of the hand. Therefore, the control unit 14 can more easily and accurately grasp whether the operation site approaches the real object.
  • Furthermore, the control unit 14 includes the index finger in the operation site.
  • In this case, the control unit 14 determines the type of the input operation in consideration of the distance between the index finger and the real object. Therefore, the control unit 14 can more easily and accurately determine the input operation performed by the user.
  • The control unit 14 sets one of the fingers of the hand as the determination target finger and determines whether the state of the hand is the touch operation permission state based on the state of the determination target finger. Therefore, the control unit 14 can determine the state of the hand in more detail and, in turn, can more easily and accurately determine the type of the input operation.
  • Furthermore, the control unit 14 includes the thumb in the determination target finger.
  • The thumb tends to be shorter and thicker than the other fingers, and individual differences in this tendency are considered small, so the thumb is often easier to detect than the other fingers. Therefore, by including the thumb in the determination target finger, the control unit 14 can more easily and accurately determine whether the state of the hand is the touch operation permission state.
  • The control unit 14 determines that the state of the hand is the touch operation permission state when the fingertip of the thumb is hidden by the palm. As mentioned above, the difference in shape between the case where the fingertip of the thumb is hidden by the palm and the case where it is not is clear, so the control unit 14 can more easily and accurately determine whether the state of the hand is the touch operation permission state.
  • The control unit 14 also performs control to display an image on the real object, so the user can more accurately perform an input operation using the image.
  • <3. Second embodiment> Next, the second embodiment of the present disclosure will be described. First, the configuration of the information processing apparatus 20 according to the second embodiment will be described based on FIGS. 7 to 9B.
  • The information processing apparatus 20 includes a storage unit 21, a display unit 22, a detection unit 23, a control unit 24, a frame unit 25, and a housing unit 26.
  • The information processing device 20 is, for example, an eyewear type information processing device. Although the information processing apparatus 20 has a glasses shape in FIG. 8, it may have a goggle shape or the like.
  • The information processing apparatus 20 has a hardware configuration including a central processing unit (CPU), a read only memory (ROM), a random access memory (RAM), a non-volatile memory, a display device, a detection device, a frame, and a housing.
  • The ROM stores information necessary for the operation of the information processing apparatus 20, such as programs.
  • The CPU reads and executes the programs stored in the ROM.
  • The display device corresponds to the lens portion of the glasses and is a transmissive display device.
  • The display device displays various images on its display surface. The user can visually recognize both the images displayed on the display device and the real objects in the real space; that is, a displayed image is a so-called virtual object.
  • The display device may display various images (virtual objects) superimposed on a real object.
  • The detection device is similar to that of the first embodiment.
  • The frame includes a front portion that holds the display device, a temple portion that hooks over the user's ear, and the like.
  • The housing incorporates each hardware component and is provided in the temple portion, although the installation position of the housing is not limited to this example.
  • The storage unit 21, the display unit 22, the detection unit 23, the control unit 24, the frame unit 25, and the housing unit 26 are realized by this hardware configuration.
  • The information processing apparatus 20 may further include operation buttons such as a power button. These operation buttons allow input operations by the user.
  • The information processing device 20 may also include a communication device or the like that can communicate with other information processing devices.
  • The storage unit 21 is configured by, for example, the ROM and stores various types of information.
  • The storage unit 21 stores images displayed by the display unit 22, programs necessary for the processing of the information processing apparatus 20, and the like.
  • The display unit 22 includes, for example, the display device.
  • The display unit 22 superimposes and displays images on various real objects (for example, the left hand LH shown in FIGS. 9A and 9B). In the example illustrated in FIGS. 9A and 9B, the display unit 22 displays the image 22A superimposed on the real object (the left hand LH).
  • The type of real object on which the display unit 22 superimposes images is not particularly limited.
  • For example, the display unit 22 may superimpose images on various walls, signs, and the like.
  • The type of image displayed by the display unit 22 is also not particularly limited.
  • For example, the display unit 22 may display an image that is the target of an input operation by the user.
  • The detection unit 23 includes, for example, the detection device.
  • The function of the detection unit 23 is the same as that of the detection unit 13 of the first embodiment. That is, the detection unit 23 captures images of, for example, the user's hand and a real object.
  • Here, a real object is an object on which an image is superimposed.
  • The detection unit 23 detects the state and position of the user's hand (for example, the right hand RH) based on the captured images.
  • The state of the hand includes, for example, the shape of the hand.
  • The position of the hand includes the positions of the fingers, the palm, and the like.
  • The detection unit 23 also detects the position, shape, and the like of the real object based on the captured images. The detection unit 23 then outputs detection information on the detection results to the control unit 24.
  • The control unit 24 is configured of, for example, a CPU (processor) and controls each component of the information processing apparatus 20.
  • For example, the control unit 24 causes the display unit 22 to display various images.
  • Furthermore, the control unit 24 recognizes the distance between the real object and the operation site of the hand based on the detection information.
  • Here, the real object is an object on which an image is superimposed, for example the user's hand.
  • The operation site is the part of the hand with which the user performs an input operation.
  • The specific details are the same as in the first embodiment. That is, the control unit 24 includes, in the operation site, one or more of the fingers and the palm of the hand. More specifically, the control unit 24 includes the index finger in the operation site, and more specifically still, the fingertip of the index finger. For example, the control unit 24 includes the fingertip RH21 of the forefinger RH2 of the right hand RH in the operation site.
  • The operation site is not limited to this and may be the fingertip of another finger, the palm, the left hand, or the like.
  • The operation site may be set in advance or may be set arbitrarily by the user. Modifications of the operation site are described later.
  • The control unit 24 determines the type of the input operation based on the state of the hand when the distance between the real object and the operation site of the hand falls within a predetermined range. For example, the control unit 24 determines whether the input operation is a touch operation based on the state of the hand. The control unit 24 can thereby easily and accurately determine whether the user is performing a touch operation. The specific determination method is described later. The control unit 24 then performs processing according to the input operation.
  • The frame unit 25 is configured of the above-described frame.
  • The housing unit 26 includes, for example, the above-described housing and incorporates the hardware configuration, in other words, the storage unit 21, the display unit 22, the detection unit 23, and the control unit 24.
  • In step S10, the display unit 22 superimposes and displays an image on a real object.
  • The user brings, for example, the forefinger RH2 of the right hand RH close to the real object on which the image is superimposed, in order to perform an input operation on the image.
  • The detection unit 23 captures an image of the user's hand and the real object (the real object on which the image is superimposed).
  • In this example, the detection unit 23 captures an image of the user's right hand RH and the real object.
  • The detection unit 23 detects the state and position of the user's hand based on the captured image. Furthermore, the detection unit 23 also detects the position and shape of the real object. Next, the detection unit 23 outputs detection information on the detection results to the control unit 24.
  • The control unit 24 specifies the distance between the real object and the operation site of the hand based on the detection information.
  • In this example, the control unit 24 specifies the distance between the real object and the fingertip RH21.
  • The control unit 24 then determines whether the distance between the real object and the operation site of the hand is within a predetermined range.
  • The predetermined range here is not particularly limited. However, since the control unit 24 determines in the processing described later whether the input operation is a touch operation, the predetermined range is preferably fairly small, for example about several centimeters. The predetermined range may be set in advance or may be set by the user.
  • If the control unit 24 determines that the distance between the real object and the operation site is within the predetermined range, the process proceeds to step S20, and the control unit 24 thereafter determines the type of the input operation based on the state of the hand. When the distance between the real object and the operation site of the hand exceeds the predetermined range, the control unit 24 ends this processing.
  • In step S20, the control unit 24 determines the state of the hand based on the detection information.
  • In this example, the control unit 24 specifies the state of the right hand RH.
  • In step S30, the control unit 24 determines whether the state of the hand is the touch operation permission state.
  • Here, the touch operation permission state means the state of the hand in which the user is permitted to perform the touch operation.
  • The control unit 24 sets one of the fingers of the hand as a determination target finger.
  • For example, the control unit 24 may include the thumb in the determination target finger.
  • The control unit 24 may then judge that the state of the hand is the touch operation permission state when the fingertip of the thumb is hidden by the palm.
  • Here, the state in which the fingertip of the thumb is hidden by the palm means, for example, that the fingertip of the thumb is at a position facing the palm.
  • When the detection unit 23 captures an image of the hand from the back side of the hand, the thumb is hidden by the palm, so the thumb hardly appears in the captured image.
  • When the detection unit 23 captures the hand from the palm side, the thumb lying over the palm is captured.
  • When the control unit 24 determines that the state of the hand is the touch operation permission state, it determines that the input operation performed by the user is a touch operation, and the process proceeds to step S40.
  • When the control unit 24 determines that the state of the hand is not the touch operation permission state, it determines that the input operation performed by the user is an operation other than the touch operation, and the process proceeds to step S50.
  • In this way, the control unit 24 determines the type of the input operation based on the state of the hand.
  • In step S40, the control unit 24 accepts the touch operation by the user. That is, when the user performs a touch operation using the operation site, the control unit 24 performs processing according to the touch operation. For example, as shown in FIG. 9A, when the user moves the fingertip RH21 of the forefinger RH2 over the image 22A, the control unit 24 draws a line image 22B along the movement trajectory. Thereafter, the control unit 24 ends the present process.
  • In step S50, the control unit 24 does not accept the touch operation.
  • However, the control unit 24 may accept an input operation other than the touch operation.
  • For example, when the user performs a hover operation, the control unit 24 may perform a process according to the hover operation (for example, a page-turning process). Thereafter, the control unit 24 ends the present process.
  • According to the second embodiment described above, the same effects as those of the first embodiment can be obtained. Furthermore, according to the second embodiment, even when a virtual object is superimposed on a real object, the input operation performed by the user can be determined more easily and accurately.
  • In the second embodiment, the virtual object is superimposed on the real object, and the distance between the real object and the operation site is measured.
  • However, the second embodiment is not limited to this; for example, only a virtual object may be arranged in the real space. That is, the display unit 22 may display the virtual object at a position that does not overlap a real object.
  • In this case, the control unit 24 sets the position of the virtual object in the real space and may perform the above-described processing using the distance between the set position of the virtual object and the operation site. Since the virtual object does not need to be superimposed on a real object, the control unit 24 can provide a wider variety of virtual objects to the user, and it can still accurately determine the input operations on these virtual objects.
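  • A minimal sketch of this variant follows; it simply measures the distance between the operation site and the position at which the virtual object was placed, treating that position as a single anchor point. The helper name, the coordinates, and the 3 cm threshold are illustrative assumptions, not values from the disclosure.

```python
import numpy as np


def distance_to_virtual_object(fingertip: np.ndarray, anchor: np.ndarray) -> float:
    """Distance between the operation site and the set position of a virtual object."""
    return float(np.linalg.norm(fingertip - anchor))


virtual_anchor = np.array([0.00, 0.10, 0.50])  # where the virtual object was placed
index_tip = np.array([0.01, 0.11, 0.52])       # detected fingertip of the index finger

if distance_to_virtual_object(index_tip, virtual_anchor) <= 0.03:
    print("operation site is within the predetermined range of the virtual object")
```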
  • In the first modification, the index finger RH2 is the operation site, and the thumb RH1 touches one of the middle finger RH3, the ring finger RH4, and the little finger RH5 (in this example, the middle finger RH3).
  • In this case, the control units 14 and 24 may determine that the state of the hand (the state of the right hand RH) is the touch operation permission state.
  • Conversely, consider the case where the index finger RH2 is the operation site and the thumb RH1 is separated from the middle finger RH3, the ring finger RH4, and the little finger RH5.
  • In this case, the control units 14 and 24 may determine that the state of the hand (the state of the right hand RH) is not the touch operation permission state.
  • Alternatively, the control units 14 and 24 may determine whether the state of the hand is the touch operation permission state depending on whether the fingertip of the thumb touches the index finger. Specifically, when the fingertip of the thumb touches the forefinger, the control units 14 and 24 may determine that the state of the hand is the touch operation permission state, and when it does not, they may determine that it is not. In this case as well, the same effects as described above can be obtained, and the first modification provides the same effects as the first and second embodiments.
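  • The fingertip-contact condition used in this modification could be approximated as in the sketch below, which treats two fingertips as touching when their detected 3D positions fall within a small threshold (exact contact cannot be observed because of sensor noise). The function fingertips_touch and the 1.5 cm threshold are assumptions made for illustration only.

```python
import numpy as np


def fingertips_touch(tip_a: np.ndarray, tip_b: np.ndarray,
                     contact_threshold_m: float = 0.015) -> bool:
    """Treat two fingertips as touching when their positions are nearly coincident."""
    return float(np.linalg.norm(tip_a - tip_b)) <= contact_threshold_m


thumb_tip = np.array([0.115, 0.045, 0.41])
index_tip = np.array([0.120, 0.050, 0.41])

# First modification: permit the touch operation when the thumb touches the index finger.
print(fingertips_touch(thumb_tip, index_tip))  # True
```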
  • In the above description, the thumb is used as the determination target finger, but another finger may be used as the determination target finger.
  • For example, the control units 14 and 24 may include one or more of the middle finger, the ring finger, and the little finger in the determination target finger. The control units 14 and 24 may then determine whether the state of the hand is the touch operation permission state based on whether the fingertip of the determination target finger is hidden by the palm. Specifically, when the fingertip of the determination target finger is hidden by the palm, the control units 14 and 24 may determine that the state of the hand is the touch operation permission state.
  • Conversely, the control units 14 and 24 may determine that the state of the hand is not the touch operation permission state when the fingertip of the determination target finger is separated from the palm.
  • In this second modification, one or more of the middle finger RH3, the ring finger RH4, and the little finger RH5 are the determination target fingers.
  • In the illustrated example, the control units 14 and 24 determine that the state of the hand (the state of the right hand RH) is not the touch operation permission state.
  • Alternatively, the control units 14 and 24 may determine whether the state of the hand is the touch operation permission state depending on whether the middle finger touches the index finger. Specifically, when the middle finger touches the forefinger, the control units 14 and 24 may determine that the state of the hand is the touch operation permission state, and when the middle finger does not touch the index finger, they may determine that it is not. In the example shown in FIG. 12, the middle finger RH3 touches the forefinger RH2; in this case, the control units 14 and 24 may determine that the state of the hand (the state of the right hand RH) is the touch operation permission state.
  • With the second modification as well, the same effects as in the first and second embodiments can be obtained.
  • Note, however, that using the thumb as the determination target finger may make it easier to determine whether the state of the hand is the touch operation permission state.
  • In the above embodiments, the index finger is used as the operation site, but another part may be used as the operation site.
  • For example, the little finger RH5 of the right hand RH may be used as the operation site,
  • or the thumb RH1 of the right hand RH may be used as the operation site.
  • In these cases, the control units 14 and 24 may determine whether the state of the hand (the state of the right hand RH) is the touch operation permission state based on the state of a finger other than the operation site (for example, the thumb RH1). With the third modification as well, the same effects as in the first and second embodiments can be obtained.
  • Note that, in step S30, the control units 14 and 24 may determine that the state of the hand is the touch operation permission state when one or more of the conditions described in the first and second embodiments and the conditions described in the first to third modifications are satisfied.
  • An information processing apparatus comprising: a control unit that determines the type of input operation based on the state of the hand when the distance between an object and the operation site of the hand falls within a predetermined range.
  • The control unit determines that the input operation is a touch operation when it determines that the state of the hand is a touch operation permission state.
  • The control unit includes, in the operation site, one or more of the fingers and the palm of the hand.
  • The control unit includes the index finger in the operation site.
  • The control unit sets one of the fingers of the hand as a determination target finger and determines whether the state of the hand is the touch operation permission state based on the state of the determination target finger.
  • The control unit determines that the state of the hand is the touch operation permission state when the index finger is included in the operation site and the fingertip of the thumb touches a finger other than the index finger.
  • The information processing apparatus according to any one of the above, wherein the control unit determines that the state of the hand is the touch operation permission state when the index finger is included in the operation site and the fingertip of the thumb touches the index finger.
  • The information processing apparatus according to any one of (5) to (9), wherein the control unit includes, in the determination target finger, at least one of the middle finger, the ring finger, and the little finger.
  • The control unit determines that the state of the hand is the touch operation permission state when the fingertip of the determination target finger is hidden by the palm.
  • The information processing apparatus according to (10) or (11), wherein the control unit includes the forefinger in the operation site, includes the middle finger in the determination target finger, and determines that the touch operation is permitted when the middle finger touches the forefinger.
  • The information processing apparatus according to any one of the above through (12), wherein the control unit determines the type of the input operation based on the state of the hand when the distance between a real object present in the real space and the operation site falls within a predetermined range.
  • The information processing apparatus according to (13), wherein the control unit performs control to display an image to be the target of the input operation on the real object.
  • The information processing apparatus according to any one of (1) to (15), wherein the control unit determines whether the state of the hand is the touch operation permission state when a virtual object is disposed in the real space and the distance between the virtual object and the operation site falls within a predetermined range.
  • A program that causes a computer to realize a control function of determining the type of input operation based on the state of the hand when the distance between an object and the operation site of the hand falls within a predetermined range.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

[Problem] The present disclosure proposes a novel, improved information processing device, etc., capable of more accurately assessing an input operation carried out by a user using a hand. [Solution] The present disclosure provides an information processing device comprising a control unit which, when the distance between an object and a part of a hand carrying out an operation is within a prescribed range, assesses the type of input operation on the basis of the state of the hand. In the present disclosure, the control unit, etc. references the distance between an object and a part of a hand carrying out an operation and the state of the hand in order to assess the type of input operation.

Description

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM
 The present disclosure relates to an information processing device, an information processing method, and a program.
 Patent Document 1 discloses an operation device that transmits an operation command based on the angle between fingers to a device when the user performs a gesture operation, that is, an input operation, using a hand.
 JP 2014-182662 A
 However, in the technique disclosed in Patent Document 1, an operation command not intended by the user may be transmitted to the device. That is, the input operation performed by the user may be erroneously recognized.
 Therefore, the present disclosure proposes a new and improved information processing apparatus, information processing method, and program capable of more accurately determining an input operation performed by a user using a hand.
 According to the present disclosure, an information processing apparatus is provided that includes a control unit that determines the type of input operation based on the state of the hand when the distance between an object and the operation site of the hand falls within a predetermined range.
 According to the present disclosure, an information processing method is provided in which a processor determines the type of input operation based on the state of the hand when the distance between an object and the operation site of the hand falls within a predetermined range.
 According to the present disclosure, a program is provided that causes a computer to realize a control function of determining the type of input operation based on the state of the hand when the distance between an object and the operation site of the hand falls within a predetermined range.
 According to the present disclosure, the control unit or the like determines the type of the input operation based on the state of the hand when the distance between the object and the operation site of the hand falls within a predetermined range.
 As described above, according to the present disclosure, the control unit or the like determines the type of the input operation by considering both the distance between the object and the operation site of the hand and the state of the hand, and can therefore determine the user's input operation more accurately. Note that the above-mentioned effects are not necessarily limiting; together with or in place of them, any of the effects shown in this specification, or other effects that can be grasped from this specification, may be achieved.
 A functional block diagram of the information processing apparatus according to the first embodiment of the present disclosure.
 An explanatory diagram showing an example of the appearance of the information processing apparatus.
 An explanatory diagram showing an example of the appearance of the information processing apparatus.
 A flowchart showing the procedure of processing performed by the information processing apparatus.
 An explanatory diagram showing an example of the state of a hand.
 An explanatory diagram showing an example of the state of a hand.
 An explanatory diagram showing an example of an input operation.
 An explanatory diagram showing an example of an input operation.
 A functional block diagram of the information processing apparatus according to the second embodiment of the present disclosure.
 A perspective view showing an example of the appearance of the information processing apparatus.
 An explanatory diagram showing an example of an input operation.
 An explanatory diagram showing an example of an input operation.
 An explanatory diagram showing an example of the state of a hand.
 An explanatory diagram showing an example of the state of a hand.
 An explanatory diagram showing an example of the state of a hand.
 An explanatory diagram showing an example of the state of a hand.
 An explanatory diagram showing an example of the state of a hand.
 An explanatory diagram showing an example of the state of a hand.
 Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In the present specification and the drawings, components having substantially the same functional configuration are assigned the same reference numerals, and redundant description is omitted.
 The description will be given in the following order.
 1. Examination by the inventor
 2. First embodiment (example of a projection type information processing apparatus)
  2-1. Configuration of the information processing apparatus
  2-2. Processing by the information processing apparatus
 3. Second embodiment (example of an eyewear type information processing apparatus)
  3-1. Configuration of the information processing apparatus
  3-2. Processing by the information processing apparatus
 4. Various modifications
  4-1. First modification
  4-2. Second modification
  4-3. Third modification
<1. Examination by the inventor>
 The present inventor arrived at each of the embodiments described below through a close examination of the technology disclosed in Patent Document 1. First, the study conducted by the inventor will be described. As described above, in the technology disclosed in Patent Document 1, when the user performs a gesture operation, an operation command based on the angle between the fingers is transmitted to the device. That is, this technique detects the angle between the fingers. However, the shape of the hand varies greatly among individuals. For this reason, when a plurality of users perform the same gesture operation (for example, a peace sign) by hand, the angle between the fingers may differ from user to user. Therefore, merely detecting the angle between the fingers may result in erroneous recognition of the gesture operation, that is, the input operation, performed by the user.
 Therefore, the present inventor diligently studied techniques capable of more accurately determining the gesture operation, that is, the input operation, performed by the user using a hand. First, the inventor examined a technique for determining the type of input operation based on the distance between a real object and the operation site of the hand. Here, the real object is the portion on which the user performs an input operation. As described in detail later, an image on which the user can perform an input operation can be displayed on the real object. The operation site is the part of the hand with which the user performs an input operation, for example the fingertip of the index finger.
 Specifically, the inventor examined a technique for determining that the input operation is a touch operation when the operation site of the hand contacts the real object. This is because the influence of individual differences on distance can be said to be small: when the user wants to operate an operation target, it is assumed that the operation site of the hand is brought close to that target.
 However, with this method, depending on the accuracy of the sensor that detects the distance between the real object and the operation site of the hand, it may be difficult to determine whether the operation site of the hand is actually in contact with the real object. Therefore, the present inventor made further studies. As a result, the inventor found that the user's input operation can be determined more accurately by focusing on both the distance between the object and the operation site of the hand and the state of the hand, and by using these pieces of information together. The present embodiments were conceived based on this knowledge. Hereinafter, the embodiments will be described in detail.
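 As an illustration only (not part of the patent disclosure), the following minimal Python sketch shows the combined determination described above. The function name determine_operation, the 3 cm threshold, and the returned labels are assumptions made purely for this example.

```python
# Hypothetical threshold standing in for the "predetermined range" of a few
# centimetres mentioned in the embodiments (expressed here in metres).
TOUCH_DISTANCE_THRESHOLD_M = 0.03


def determine_operation(distance_to_object_m: float, touch_permitted: bool) -> str:
    """Combine the object-to-operation-site distance with the hand state."""
    if distance_to_object_m > TOUCH_DISTANCE_THRESHOLD_M:
        return "none"   # the operation site is not close enough to the object
    if touch_permitted:
        return "touch"  # close to the object AND the hand state permits a touch
    return "other"      # close to the object, but the hand state forbids a touch


if __name__ == "__main__":
    print(determine_operation(0.01, True))   # touch
    print(determine_operation(0.01, False))  # other
    print(determine_operation(0.10, True))   # none
```

 The point of the sketch is simply that proximity alone never triggers a touch; the hand state must also permit it.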
<2. First embodiment>
(2-1. Configuration of the information processing apparatus)
 Next, the first embodiment of the present disclosure will be described. First, the configuration of the information processing apparatus 10 according to the first embodiment will be described based on FIGS. 1 to 3.
 The information processing apparatus 10 includes a storage unit 11, a display unit 12, a detection unit 13, a control unit 14, and a housing unit 15. The information processing apparatus 10 is, for example, a projection apparatus. The information processing apparatus 10 has a hardware configuration including a central processing unit (CPU), a read only memory (ROM), a random access memory (RAM), a non-volatile memory, a display device, a detection device, and a housing.
 The ROM stores information necessary for the operation of the information processing apparatus 10, such as programs. The CPU reads and executes the programs stored in the ROM. The display device projects images onto various real objects (for example, the table 100, the wall 200, and the like) to display the images on these real objects. The detection device captures images of, for example, the user's hand and the real objects. Based on the captured images, the detection device detects the state (for example, the shape) and position of the hand of the user (the person U shown in FIGS. 2 and 3). Furthermore, the detection device also detects the position, shape, and the like of the real objects. The detection device is, for example, a depth sensor, but is not limited to this as long as it can realize the above-mentioned functions. For example, the detection device may be any device capable of specifying the positions of three-dimensional feature points of the hand. The housing incorporates each hardware component. The storage unit 11, the display unit 12, the detection unit 13, the control unit 14, and the housing unit 15 are realized by this hardware configuration. The information processing apparatus 10 may further include operation buttons such as a power button; these operation buttons allow input operations by the user. Furthermore, the information processing device 10 may include a communication device or the like that can communicate with other information processing devices.
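 As an illustration of the kind of per-frame data such a detection device might provide, the sketch below defines a hypothetical hand observation with named three-dimensional feature points. The class name, the key names, and the coordinates are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

Point3D = Tuple[float, float, float]  # metres, in the sensor's coordinate frame


@dataclass
class HandObservation:
    """One frame of hand data as a depth-sensor pipeline might report it."""
    keypoints: Dict[str, Point3D]  # e.g. "thumb_tip", "index_tip", "palm_center"
    is_right_hand: bool


# Example frame with made-up coordinates (a hand hovering above a table).
frame = HandObservation(
    keypoints={
        "palm_center": (0.10, 0.02, 0.45),
        "thumb_tip":   (0.07, 0.01, 0.44),
        "index_tip":   (0.12, 0.05, 0.41),
    },
    is_right_hand=True,
)
```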
 記憶部11は、例えばROM等で構成され、各種の情報を記憶する。例えば、記憶部11は、表示部12によって表示される画像、情報処理装置10の処理に必要なプログラム等を記憶する。 The storage unit 11 is configured by, for example, a ROM and stores various types of information. For example, the storage unit 11 stores an image displayed by the display unit 12, a program necessary for the processing of the information processing apparatus 10, and the like.
 表示部12は、例えば表示装置等で構成される。表示部12は、画像を各種実オブジェクト(例えばテーブル100、壁200等)に投影することで、画像を実オブジェクト上に表示する。図2、図3に示す例では、表示部12は、画像12Aを実オブジェクト上に表示している。表示部12による画像の表示対象となる実オブジェクトの種類は特に問われない。また、表示部12が表示する画像の種類も特に問われない。例えば、表示部12は、ユーザによる入力操作の対象となる画像が表示されてもよい。 The display unit 12 includes, for example, a display device. The display unit 12 displays an image on a real object by projecting the image on various real objects (for example, the table 100, the wall 200, and the like). In the example shown in FIGS. 2 and 3, the display unit 12 displays the image 12A on the real object. The type of real object to be displayed by the display unit 12 is not particularly limited. Further, the type of image displayed by the display unit 12 is not particularly limited. For example, the display unit 12 may display an image to be subjected to an input operation by the user.
 検知部13は、例えば検知装置等で構成される。検知部13は、例えばユーザの手及び実オブジェクトを撮像する。そして、検知部13は、撮像画像に基づいて、ユーザの手(例えば、右手RH)の状態、及び位置を検知する。ここで、手の状態には、例えば手の形状が含まれる。手の位置には、指、手のひらの位置等が含まれる。さらに、検知部13は、撮像画像に基づいて、実オブジェクトの位置、形状等も検知する。そして、検知部13は、検知結果に関する検知情報を制御部14に出力する。 The detection unit 13 includes, for example, a detection device. The detection unit 13 captures, for example, the user's hand and a real object. Then, the detection unit 13 detects the state and the position of the user's hand (for example, the right hand RH) based on the captured image. Here, the state of the hand includes, for example, the shape of the hand. The position of the hand includes the position of the finger, the palm, and the like. Furthermore, the detection unit 13 also detects the position, shape, and the like of the real object based on the captured image. Then, the detection unit 13 outputs detection information on the detection result to the control unit 14.
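The disclosure does not prescribe a data format for this detection information, but it can be pictured as a small record of 3D hand feature points plus the sensed geometry of the real object. The following sketch is purely illustrative; the class names, field names, and feature-point labels are assumptions and not part of the specification.

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

Point3D = Tuple[float, float, float]  # metres, in the detection device's coordinate frame


@dataclass
class HandDetection:
    """State and position of one detected hand (e.g. the right hand RH)."""
    # Assumed feature-point names: "thumb_tip", "index_tip", "palm_center", ...
    keypoints: Dict[str, Point3D] = field(default_factory=dict)
    is_right_hand: bool = True


@dataclass
class DetectionInfo:
    """Detection information passed from the detection unit to the control unit."""
    hand: HandDetection
    # Geometry of the real object near the operation part, e.g. the depth of the
    # projection surface directly behind the fingertip (metres).
    object_surface_depth_m: float = 0.0
```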
The control unit 14 is configured by, for example, the CPU (processor) and controls each component of the information processing apparatus 10. For example, the control unit 14 causes the display unit 12 to display various images. Furthermore, the control unit 14 recognizes the distance between the real object and the operation part of the hand based on the detection information. Here, the real object is an object onto which an image is projected from the display unit 12, for example the table 100, the wall 200, or the like.
The operation part is the part with which the user performs an input operation. The control unit 14 includes, in the operation part, one or more of the fingers and the palm of the hand. More specifically, the control unit 14 includes the index finger in the operation part, and still more specifically the fingertip of the index finger, the fingertip being, for example, the portion beyond the first joint. For example, the control unit 14 includes the fingertip RH21 of the index finger RH2 of the right hand RH in the operation part. Of course, the operation part is not limited to this and may be the fingertip of another finger, the palm, the left hand, or the like. The operation part may be set in advance or may be set arbitrarily by the user. Modifications of the operation part will be described later.
When the distance between the real object and the operation part of the hand falls within a predetermined range, the control unit 14 determines the type of the input operation based on the state of the hand. For example, the control unit 14 determines whether or not the input operation is a touch operation based on the state of the hand, which allows it to determine easily and accurately whether the user is performing a touch operation. Here, a touch operation is any of various operations performed by touching the real object, in other words the image displayed on the real object. Examples of touch operations include a selection operation (an operation of selecting the touched portion) and a drawing operation (an operation of drawing a line along the movement trajectory by moving the touching operation part). The specific determination method will be described later. The control unit 14 then performs processing according to the input operation. The housing unit 15 is configured by, for example, the housing described above and encloses the hardware configuration, in other words the storage unit 11, the display unit 12, the detection unit 13, and the control unit 14.
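As a rough illustration only, the judgement described above reduces to two tests per frame: a distance check against the predetermined range, then a hand-state check. The function below is a minimal sketch; the 3 cm threshold is merely one example of the "several centimetres" mentioned later, and both inputs are assumed to have been computed elsewhere (by the detection unit and by the hand-state test of step S30).

```python
def classify_input_operation(distance_m: float,
                             hand_is_touch_permitting: bool,
                             touch_range_m: float = 0.03) -> str:
    """Classify one frame of input.

    distance_m: distance between the real object and the operation part of the hand.
    hand_is_touch_permitting: result of the hand-state test (e.g. thumb tip hidden by the palm).
    Returns "touch", "other" (e.g. hover), or "none" when no judgement is made.
    """
    if distance_m > touch_range_m:
        return "none"        # out of the predetermined range: the type is not judged
    if hand_is_touch_permitting:
        return "touch"       # step S40: accept touch operations
    return "other"           # step S50: reject touch, optionally accept e.g. hover
```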
(2-2. Processing by the information processing apparatus)
Next, the procedure of processing by the information processing apparatus 10 will be described along the flowchart shown in FIG. 4.
In step S10, the display unit 12 displays an image on a real object. To perform an input operation on the image, the user brings, for example, the index finger RH2 of the right hand RH close to the image. The detection unit 13 captures images of the user's hand and the real object (the real object on which the image is displayed). When the user brings the right hand RH close to the image, the detection unit 13 captures images of the user's right hand RH and the real object.
The detection unit 13 then detects the state and position of the user's hand based on the captured images, and also detects the position, shape, and the like of the real object. The detection unit 13 then outputs detection information on the detection results to the control unit 14.
The control unit 14 specifies the distance between the real object and the operation part of the hand based on the detection information. When the operation part is the fingertip RH21 of the index finger RH2 of the right hand RH, the control unit 14 specifies the distance between the real object and the fingertip RH21.
The control unit 14 then determines whether or not the distance between the real object and the operation part of the hand is within a predetermined range. The predetermined range here is not particularly limited. However, in the processing described later, the control unit 14 determines whether or not the input operation is a touch operation, so the predetermined range is preferably fairly small, for example on the order of a few centimeters. The predetermined range may be set in advance or may be set by the user.
When the control unit 14 determines that the distance between the real object and the operation part is within the predetermined range, the processing proceeds to step S20, after which the control unit 14 determines the type of the input operation based on the state of the hand. When the distance between the real object and the operation part of the hand exceeds the predetermined range, the control unit 14 ends this processing.
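One conceivable way to obtain this distance, assuming the detection device is a depth sensor looking toward the projection surface, is to compare the fingertip's measured depth with the depth of the surface recorded while the scene was empty. The sketch below rests on that assumption; the array layout, units, and calibration step are illustrative rather than taken from the disclosure.

```python
from typing import Tuple

import numpy as np


def fingertip_surface_distance(depth_map: np.ndarray,
                               surface_depth: np.ndarray,
                               fingertip_px: Tuple[int, int]) -> float:
    """Distance (metres) between the fingertip and the real object behind it.

    depth_map:     HxW depth image of the current frame (metres).
    surface_depth: HxW depth image of the empty surface, captured beforehand.
    fingertip_px:  (row, col) pixel position of the fingertip RH21.
    """
    r, c = fingertip_px
    # The surface lies behind the finger, so the difference is non-negative in the ideal case.
    return float(max(surface_depth[r, c] - depth_map[r, c], 0.0))


# Example: the fingertip hovers about 2 cm above the table plane measured at 1.00 m.
surface = np.full((480, 640), 1.00)
frame = surface.copy()
frame[240, 320] = 0.98
print(fingertip_surface_distance(frame, surface, (240, 320)))  # -> roughly 0.02
```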
In step S20, the control unit 14 determines the state of the hand based on the detection information. When the operation part is included in the right hand RH, the control unit 14 specifies the state of the right hand RH.
In step S30, the control unit 14 determines whether or not the state of the hand is a touch operation permitting state. Here, the touch operation permitting state means the state of the hand in which the user is permitted to perform a touch operation.
Specifically, the control unit 14 sets one of the fingers of the hand as a determination target finger. The control unit 14 may include the thumb in the determination target finger. The control unit 14 may then determine that the state of the hand is the touch operation permitting state when the fingertip of the thumb is hidden by the palm. Here, the state in which the fingertip of the thumb is hidden by the palm means, for example, that the fingertip of the thumb is located at a position facing the palm. When the detection unit 13 images the hand from the back side of the hand, the thumb is hidden behind the palm, so the thumb hardly appears in the captured image. On the other hand, when the detection unit 13 images the hand from the palm side, the thumb located over the palm is captured.
For example, as shown in FIG. 5A, the control unit 14 may determine that the state of the right hand RH is the touch operation permitting state when the fingertip RH11 of the thumb RH1 of the right hand RH is hidden by the palm, that is, when the fingertip RH11 of the thumb RH1 is located at a position facing the palm. In other words, the state shown in FIG. 5A is an example of the touch operation permitting state. On the other hand, as shown in FIG. 5B, the control unit 14 may determine that the state of the right hand RH is not the touch operation permitting state when the fingertip RH11 of the thumb RH1 is not hidden by the palm, that is, when the fingertip RH11 is outside the palm. Comparing FIGS. 5A and 5B, the two differ in the state of the thumb. Thus, when the thumb is included in the determination target finger, whether or not the state of the hand is the touch operation permitting state is decided by the state of the thumb. Here, the thumb tends to be shorter and thicker than the other fingers, and this tendency is considered to show little individual variation, so the thumb is often easier to identify than the other fingers. Therefore, by including the thumb in the determination target finger, the control unit 14 can determine more easily and accurately whether or not the state of the hand is the touch operation permitting state.
Furthermore, in the above example, the control unit 14 determines whether or not the state of the hand is the touch operation permitting state according to whether or not the fingertip of the thumb is hidden by the palm. As shown in FIGS. 5A and 5B, the difference in shape between the case where the fingertip of the thumb is hidden by the palm and the case where it is not is clear. Therefore, the control unit 14 can determine more easily and accurately whether or not the state of the hand is the touch operation permitting state.
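If the detection unit supplies 3D feature points of the hand, the FIG. 5A/5B distinction, namely whether the thumb tip is hidden behind the palm, could be approximated geometrically: the thumb tip lies close to the palm plane and does not stick out sideways. This is only one possible realisation; the feature-point names ("thumb_tip", "palm_center", "palm_normal") and the thresholds are assumptions for illustration.

```python
import numpy as np


def thumb_hidden_in_palm(keypoints: dict, max_height_m: float = 0.04) -> bool:
    """Approximate test for the touch-permitting state of FIG. 5A.

    keypoints maps assumed names ("thumb_tip", "palm_center", "palm_normal") to 3D
    coordinates / vectors in metres; "palm_normal" points away from the palm surface.
    """
    thumb_tip = np.asarray(keypoints["thumb_tip"], dtype=float)
    palm_center = np.asarray(keypoints["palm_center"], dtype=float)
    palm_normal = np.asarray(keypoints["palm_normal"], dtype=float)
    palm_normal /= np.linalg.norm(palm_normal)

    offset = thumb_tip - palm_center
    height = float(np.dot(offset, palm_normal))                    # distance in front of the palm
    lateral = float(np.linalg.norm(offset - height * palm_normal)) # sideways spread over the palm

    # Hidden: the thumb tip faces the palm (small positive height) and does not
    # stick out to the side as in FIG. 5B.
    return 0.0 <= height <= max_height_m and lateral <= 0.06
```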
When the control unit 14 determines that the state of the hand is the touch operation permitting state, it determines that the input operation performed by the user is a touch operation and proceeds to step S40. On the other hand, when the control unit 14 determines that the state of the hand is not the touch operation permitting state, it determines that the input operation performed by the user is an operation other than a touch operation and proceeds to step S50. In this way, the control unit 14 determines the type of the input operation based on the state of the hand.
In step S40, the control unit 14 accepts a touch operation by the user. That is, when the user performs a touch operation (input operation) using the operation part, the control unit 14 performs processing according to the touch operation. For example, as shown in FIG. 6A, when the user moves the fingertip RH21 of the index finger RH2 over the image 12A, the control unit 14 draws a line image 12B along the movement trajectory. The control unit 14 then ends this processing.
On the other hand, in step S50, the control unit 14 does not accept a touch operation. For example, as shown in FIG. 6B, even when the user moves the fingertip RH21 of the index finger RH2 over the image 12A, the control unit 14 does not perform any processing in response. The control unit 14 may, however, accept input operations other than touch operations. For example, when the user performs a hover operation (an input operation of moving the hand over the image), the control unit 14 may perform processing according to the hover operation (for example, page-turning processing). The control unit 14 then ends this processing.
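The handling in steps S40 and S50 can be pictured as the following sketch: while touch operations are accepted, fingertip positions are accumulated into a stroke along the movement trajectory (the line image 12B); otherwise the stroke is closed and the same movement may be handed to a non-touch handler such as page turning. The class and function names are hypothetical.

```python
class DrawingCanvas:
    """Accumulates strokes drawn along the fingertip trajectory (line image 12B)."""

    def __init__(self):
        self.strokes = []          # each stroke is a list of (x, y) points
        self._current = None

    def touch_move(self, point):
        if self._current is None:  # a new touch starts a new stroke
            self._current = []
            self.strokes.append(self._current)
        self._current.append(point)

    def touch_end(self):
        self._current = None


def handle_frame(operation: str, fingertip_xy, canvas: DrawingCanvas, on_hover=None):
    """operation is the result of the hand-state judgement: 'touch', 'other', or 'none'."""
    if operation == "touch":
        canvas.touch_move(fingertip_xy)      # step S40: draw along the movement trajectory
    else:
        canvas.touch_end()
        if operation == "other" and on_hover is not None:
            on_hover(fingertip_xy)           # step S50: e.g. a page-turning process
```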
As described above, according to the first embodiment, the control unit 14 determines the type of the input operation based on the state of the hand when the distance between the real object and the operation part of the hand falls within the predetermined range. In this way, the control unit 14 determines the type of the input operation in consideration of both the distance between the real object and the operation part of the hand and the state of the hand, and can therefore determine the user's input operation more accurately. Specifically, the control unit 14 takes into account the distance between the real object and the operation part of the hand, a quantity for which individual variation is considered to be small. However, depending on the accuracy of the detection unit 13, it may be difficult to determine whether or not the real object and the operation part of the hand are actually in contact. The control unit 14 therefore determines the type of the user's input operation based on the state of the hand, so that even in such a case it can determine the user's input operation more easily and accurately. As a result, the user can perform input operations more accurately.
Furthermore, when the control unit 14 determines that the state of the hand is the touch operation permitting state, it can determine that the input operation is a touch operation. This allows the control unit 14 to determine more easily and accurately whether or not the user has performed a touch operation.
Furthermore, the control unit 14 includes, in the operation part, one or more of the fingers and the palm of the hand. The control unit 14 can therefore grasp more easily and accurately whether or not the operation part has approached the real object.
Furthermore, the control unit 14 includes the index finger in the operation part. When performing a touch operation, the user is considered to use the index finger relatively often. The control unit 14 determines the type of the input operation in consideration of the distance between the index finger and the real object, and can therefore determine the input operation performed by the user more easily and accurately.
Furthermore, the control unit 14 sets one of the fingers of the hand as the determination target finger and determines whether or not the state of the hand is the touch operation permitting state based on the state of the determination target finger. The control unit 14 can therefore determine the state of the hand in more detail and, in turn, determine the type of the input operation more easily and accurately.
Furthermore, the control unit 14 includes the thumb in the determination target finger. The thumb tends to be shorter and thicker than the other fingers, and this tendency is considered to show little individual variation, so the thumb is often easier to detect than the other fingers. Therefore, by including the thumb in the determination target finger, the control unit 14 can determine more easily and accurately whether or not the state of the hand is the touch operation permitting state.
Furthermore, the control unit 14 determines that the state of the hand is the touch operation permitting state when the fingertip of the thumb is hidden by the palm. As described above, the difference in shape between the case where the fingertip of the thumb is hidden by the palm and the case where it is not is clear. Therefore, the control unit 14 can determine more easily and accurately whether or not the state of the hand is the touch operation permitting state.
Furthermore, the control unit 14 performs control to display an image on the real object. The user can therefore perform input operations using the image more accurately.
<3. Second embodiment>
(3-1. Configuration of Information Processing Apparatus)
Next, a second embodiment of the present disclosure will be described. First, the configuration of the information processing apparatus 20 according to the second embodiment will be described with reference to FIGS. 7 to 9B.
The information processing apparatus 20 includes a storage unit 21, a display unit 22, a detection unit 23, a control unit 24, a frame unit 25, and a housing unit 26. The information processing apparatus 20 is, for example, an eyewear-type information processing apparatus. Although the information processing apparatus 20 has the shape of glasses in FIG. 8, it may instead have the shape of goggles or the like. The information processing apparatus 20 has a hardware configuration including a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), a non-volatile memory, a display device, a detection device, a frame, a housing, and the like.
The ROM stores information necessary for the operation of the information processing apparatus 20, such as programs. The CPU reads and executes the programs recorded in the ROM. The display device corresponds to the lens portion of the glasses and is a transmissive display device. The display device displays various images on its display surface. The user can visually recognize both the images displayed on the display device and the real objects in the real space; in other words, the displayed images are so-called virtual objects. The display device may display various images (virtual objects) superimposed on real objects. The detection device is the same as in the first embodiment. The frame includes a front portion serving as the frame of the display device, temple portions hung on the user's ears, and the like. The housing encloses the hardware components and is provided in a temple portion; the installation position of the housing is not limited to this example.
The storage unit 21, the display unit 22, the detection unit 23, the control unit 24, the frame unit 25, and the housing unit 26 are realized by this hardware configuration. The information processing apparatus 20 may further include operation buttons, such as a power button, that accept input operations by the user, and may also include a communication device or the like capable of communicating with other information processing apparatuses.
The storage unit 21 is configured by, for example, the ROM and stores various kinds of information. For example, the storage unit 21 stores images to be displayed by the display unit 22, programs necessary for the processing of the information processing apparatus 20, and the like.
The display unit 22 is configured by, for example, the display device. The display unit 22 displays images superimposed on various real objects (for example, the left hand LH shown in FIGS. 9A and 9B). In the example shown in FIGS. 9A and 9B, the display unit 22 displays an image 22A superimposed on the real object (the left hand LH). The kind of real object on which the display unit 22 displays images is not particularly limited; for example, the display unit 22 may display images superimposed on various walls, billboards, signs, and the like. The kind of image displayed by the display unit 22 is also not particularly limited. For example, the display unit 22 may display an image that is the target of an input operation by the user.
The detection unit 23 is configured by, for example, the detection device. The function of the detection unit 23 is the same as that of the detection unit 13 of the first embodiment. That is, the detection unit 23 captures images of, for example, the user's hand and the real object. Here, the real object is the object on which an image is superimposed. Based on the captured images, the detection unit 23 detects the state and position of the user's hand (for example, the right hand RH). The state of the hand includes, for example, the shape of the hand, and the position of the hand includes the positions of the fingers, the palm, and so on. The detection unit 23 also detects the position, shape, and the like of the real object based on the captured images. The detection unit 23 then outputs detection information on the detection results to the control unit 24.
The control unit 24 is configured by, for example, the CPU (processor) and controls each component of the information processing apparatus 20. For example, the control unit 24 causes the display unit 22 to display various images. Furthermore, the control unit 24 recognizes the distance between the real object and the operation part of the hand based on the detection information. Here, the real object is the object on which an image is superimposed, for example the user's hand or the like.
The operation part is the part with which the user performs an input operation, and the specific details are the same as in the first embodiment. That is, the control unit 24 includes, in the operation part, one or more of the fingers and the palm of the hand. More specifically, the control unit 24 includes the index finger in the operation part, and still more specifically the fingertip of the index finger. For example, the control unit 24 includes the fingertip RH21 of the index finger RH2 of the right hand RH in the operation part. Of course, the operation part is not limited to this and may be the fingertip of another finger, the palm, the left hand, or the like. The operation part may be set in advance or may be set arbitrarily by the user. Modifications of the operation part will be described later.
When the distance between the real object and the operation part of the hand falls within a predetermined range, the control unit 24 determines the type of the input operation based on the state of the hand. For example, the control unit 24 determines whether or not the input operation is a touch operation based on the state of the hand, which allows it to determine easily and accurately whether the user is performing a touch operation. The specific determination method will be described later. The control unit 24 then performs processing according to the input operation. The frame unit 25 is configured by the frame described above. The housing unit 26 is configured by, for example, the housing described above and encloses the hardware configuration, in other words the storage unit 21, the display unit 22, the detection unit 23, and the control unit 24.
(3-2. Processing by the information processing apparatus)
The procedure of processing by the information processing apparatus 20 is substantially the same as in the first embodiment, and will therefore be described along the flowchart shown in FIG. 4.
In step S10, the display unit 22 displays an image superimposed on a real object. To perform an input operation on the image, the user brings, for example, the index finger RH2 of the right hand RH close to the real object on which the image is superimposed. The detection unit 23 captures images of the user's hand and the real object (the real object on which the image is superimposed). When the user brings the right hand RH close to the image, the detection unit 23 captures images of the user's right hand RH and the real object.
The detection unit 23 then detects the state and position of the user's hand based on the captured images, and also detects the position, shape, and the like of the real object. The detection unit 23 then outputs detection information on the detection results to the control unit 24.
The control unit 24 specifies the distance between the real object and the operation part of the hand based on the detection information. When the operation part is the fingertip RH21 of the index finger RH2 of the right hand RH, the control unit 24 specifies the distance between the real object and the fingertip RH21.
The control unit 24 then determines whether or not the distance between the real object and the operation part of the hand is within a predetermined range. The predetermined range here is not particularly limited. However, in the processing described later, the control unit 24 determines whether or not the input operation is a touch operation, so the predetermined range is preferably fairly small, for example on the order of a few centimeters. The predetermined range may be set in advance or may be set by the user.
When the control unit 24 determines that the distance between the real object and the operation part is within the predetermined range, the processing proceeds to step S20, after which the control unit 24 determines the type of the input operation based on the state of the hand. When the distance between the real object and the operation part of the hand exceeds the predetermined range, the control unit 24 ends this processing.
In step S20, the control unit 24 determines the state of the hand based on the detection information. When the operation part is included in the right hand RH, the control unit 24 specifies the state of the right hand RH.
In step S30, the control unit 24 determines whether or not the state of the hand is the touch operation permitting state. Here, the touch operation permitting state means the state of the hand in which the user is permitted to perform a touch operation.
Specifically, the control unit 24 sets one of the fingers of the hand as a determination target finger. The control unit 24 may include the thumb in the determination target finger. The control unit 24 may then determine that the state of the hand is the touch operation permitting state when the fingertip of the thumb is hidden by the palm. Here, the state in which the fingertip of the thumb is hidden by the palm means, for example, that the fingertip of the thumb is located at a position facing the palm. When the detection unit 23 images the hand from the back side of the hand, the thumb is hidden behind the palm, so the thumb hardly appears in the captured image. On the other hand, when the detection unit 23 images the hand from the palm side, the thumb located over the palm is captured.
When the control unit 24 determines that the state of the hand is the touch operation permitting state, it determines that the input operation performed by the user is a touch operation and proceeds to step S40. On the other hand, when the control unit 24 determines that the state of the hand is not the touch operation permitting state, it determines that the input operation performed by the user is an operation other than a touch operation and proceeds to step S50. In this way, the control unit 24 determines the type of the input operation based on the state of the hand.
In step S40, the control unit 24 accepts a touch operation by the user. That is, when the user performs a touch operation using the operation part, the control unit 24 performs processing according to the touch operation. For example, as shown in FIG. 9A, when the user moves the fingertip RH21 of the index finger RH2 over the image 22A, the control unit 24 draws a line image 22B along the movement trajectory. The control unit 24 then ends this processing.
On the other hand, in step S50, the control unit 24 does not accept a touch operation. For example, as shown in FIG. 9B, even when the user moves the fingertip RH21 of the index finger RH2 over the image 22A, the control unit 24 does not perform any processing in response. The control unit 24 may, however, accept input operations other than touch operations. For example, when the user performs a hover operation (an operation of moving the hand over the image), the control unit 24 may perform processing according to the hover operation (for example, page-turning processing). The control unit 24 then ends this processing.
According to the second embodiment, the same effects as in the first embodiment can be obtained. Furthermore, according to the second embodiment, even when a virtual object is superimposed on a real object, the input operation performed by the user can be determined more easily and accurately.
Here, in the second embodiment, a virtual object is superimposed on a real object and the distance between the real object and the operation part is measured. However, the second embodiment is not limited to this; for example, only a virtual object may be placed in the real space. That is, the display unit 22 may display the virtual object at a position where it does not overlap any real object. In this case, the control unit 24 sets the position of the virtual object in the real space in advance, and may perform the processing described above using the distance between the set position of the virtual object and the operation part. In this case, since the control unit 24 does not need to superimpose the virtual object on a real object, it can provide the user with a wider variety of virtual objects, and it can still accurately determine input operations on these virtual objects.
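For this variant, the control unit 24 keeps the virtual object's set position (and extent) in real-space coordinates and measures the fingertip against that instead of against a sensed surface. A minimal sketch follows, assuming the virtual object is represented by an axis-aligned bounding box in metres; the representation and the example numbers are illustrative only.

```python
import numpy as np


def distance_to_virtual_object(fingertip: np.ndarray,
                               box_min: np.ndarray,
                               box_max: np.ndarray) -> float:
    """Distance from the fingertip to an axis-aligned box standing in for the virtual object.

    The box is the position the control unit assigned to the virtual object in real space;
    0.0 means the fingertip has reached (or entered) the object.
    """
    clamped = np.minimum(np.maximum(fingertip, box_min), box_max)
    return float(np.linalg.norm(fingertip - clamped))


# Example: a virtual panel about 30 cm in front of the user, 20 x 15 cm and 1 cm thick.
panel_min = np.array([-0.10, -0.075, 0.295])
panel_max = np.array([0.10, 0.075, 0.305])
print(distance_to_virtual_object(np.array([0.0, 0.0, 0.27]), panel_min, panel_max))  # -> 0.025
```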
<4. Various modifications>
(4-1. First Modification)
In the first and second embodiments described above, whether or not the state of the hand is the touch operation permitting state is determined based on whether or not the fingertip of the thumb is hidden by the palm. However, the method of determining whether or not the state of the hand is the touch operation permitting state is not limited to this. For example, when the index finger is the operation part, the control units 14 and 24 may determine whether or not the state of the hand is the touch operation permitting state according to whether or not the fingertip of the thumb is touching a finger other than the index finger. For example, in the example shown in FIG. 10A, the index finger RH2 is the operation part, and the thumb RH1 is touching one of the middle finger RH3, the ring finger RH4, and the little finger RH5 (in this example, the middle finger RH3). In this case, the control units 14 and 24 may determine that the state of the hand (the state of the right hand RH) is the touch operation permitting state. On the other hand, in the example shown in FIG. 10B, the index finger RH2 is the operation part, and the thumb RH1 is separated from the middle finger RH3, the ring finger RH4, and the little finger RH5. In this case, the control units 14 and 24 may determine that the state of the hand (the state of the right hand RH) is not the touch operation permitting state.
Also, when the index finger is the operation part, the control units 14 and 24 may determine whether or not the state of the hand is the touch operation permitting state according to whether or not the fingertip of the thumb is touching the index finger itself. Specifically, the control units 14 and 24 may determine that the state of the hand is the touch operation permitting state when the fingertip of the thumb is touching the index finger, and that it is not when the fingertip of the thumb is not touching the index finger. In this case as well, the same effects as above can be obtained. The first modification also provides the same effects as the first and second embodiments.
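The test of this modification, whether the thumb tip is touching another finger, can likewise be approximated from 3D feature points by thresholding the distance between the thumb tip and the relevant fingertips; passing a different finger set switches between the FIG. 10A variant and the variant in which the thumb touches the index finger itself. The feature-point names and the 2 cm contact threshold are assumptions.

```python
import numpy as np


def thumb_touches(keypoints: dict,
                  target_fingers=("middle_tip", "ring_tip", "little_tip"),
                  contact_m: float = 0.02) -> bool:
    """True if the thumb tip is within contact_m of any of the target fingertips.

    Pass target_fingers=("index_tip",) for the variant in which touching the index
    finger itself puts the hand into the touch operation permitting state.
    """
    thumb = np.asarray(keypoints["thumb_tip"], dtype=float)
    return any(
        np.linalg.norm(thumb - np.asarray(keypoints[name], dtype=float)) <= contact_m
        for name in target_fingers
        if name in keypoints
    )
```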
(4-2. Second Modification)
In the first and second embodiments described above, the thumb is used as the determination target finger, but another finger may be used instead. For example, the control units 14 and 24 may include one or more of the middle finger, the ring finger, and the little finger in the determination target finger. The control units 14 and 24 may then determine whether or not the state of the hand is the touch operation permitting state according to whether or not the fingertip of the determination target finger is hidden by the palm. Specifically, the control units 14 and 24 may determine that the state of the hand is the touch operation permitting state when the fingertip of the determination target finger is hidden by the palm, and that it is not when the fingertip of the determination target finger is away from the palm. In the example shown in FIG. 11, one or more of the middle finger RH3, the ring finger RH4, and the little finger RH5 are the determination target fingers. In this example, the fingertips of the middle finger RH3, the ring finger RH4, and the little finger RH5 are all away from the palm, so the control units 14 and 24 determine that the state of the hand (the state of the right hand RH) is not the touch operation permitting state.
Also, when the index finger is the operation part and the middle finger is the determination target finger, the control units 14 and 24 may determine whether or not the state of the hand is the touch operation permitting state according to whether or not the middle finger is touching the index finger. Specifically, the control units 14 and 24 may determine that the state of the hand is the touch operation permitting state when the middle finger is touching the index finger, and that it is not when the middle finger is not touching the index finger. In the example shown in FIG. 12, the middle finger RH3 is touching the index finger RH2. In this case, the control units 14 and 24 may determine that the state of the hand (the state of the right hand RH) is the touch operation permitting state.
The second modification also provides the same effects as the first and second embodiments. However, as in the first and second embodiments, using the thumb as the determination target finger may make it easier to determine whether or not the state of the hand is the touch operation permitting state.
(4-3. Third Modification)
In the first and second embodiments described above, the index finger is used as the operation part, but another part may be used instead. For example, as shown in FIG. 13, the little finger RH5 of the right hand RH may be used as the operation part, and as shown in FIG. 14, the thumb RH1 of the right hand RH may be used as the operation part. In these cases, the control units 14 and 24 may determine whether or not the state of the hand (the state of the right hand RH) is the touch operation permitting state based on the state of a finger other than the operation part (the thumb RH1 in the example of FIG. 13, the index finger RH2 or the like in the example of FIG. 14). The third modification also provides the same effects as the first and second embodiments.
Each of the above modifications and the first and second embodiments may be combined arbitrarily. For example, in the processing of step S30, the control units 14 and 24 may determine that the state of the hand is the touch operation permitting state when any one or more of the conditions described in the first and second embodiments and the conditions described in the first to third modifications are satisfied.
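When the criteria are combined in this way, the step S30 test can simply become a disjunction of the individual conditions: if any one configured hand-state criterion holds, the hand is treated as being in the touch operation permitting state. A minimal, illustrative sketch follows; the condition callables stand in for the hand-state tests sketched in the earlier examples.

```python
from typing import Callable, Dict, Iterable, Tuple

HandKeypoints = Dict[str, Tuple[float, float, float]]


def is_touch_permitting(keypoints: HandKeypoints,
                        conditions: Iterable[Callable[[HandKeypoints], bool]]) -> bool:
    """Step S30 with combined criteria: any single satisfied condition is enough."""
    return any(condition(keypoints) for condition in conditions)


# Example with trivial stand-in conditions:
print(is_touch_permitting({}, [lambda k: False, lambda k: True]))  # -> True
```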
The preferred embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various changes or modifications within the scope of the technical idea described in the claims, and it is understood that these also naturally belong to the technical scope of the present disclosure. For example, the first and second embodiments describe a projection-type information processing apparatus and an eyewear-type information processing apparatus, but the technical scope of the present disclosure is not limited to these information processing apparatuses.
The effects described in this specification are merely explanatory or illustrative and are not limiting. That is, the technology according to the present disclosure can exhibit other effects that are apparent to those skilled in the art from the description of this specification, in addition to or in place of the above effects.
The following configurations also belong to the technical scope of the present disclosure.
(1)
An information processing apparatus including a control unit that, when the distance between an object and an operation part of a hand falls within a predetermined range, determines the type of an input operation based on the state of the hand.
(2)
The information processing apparatus according to (1), in which the control unit determines that the input operation is a touch operation when it determines that the state of the hand is a touch operation permitting state.
(3)
The information processing apparatus according to (2), in which the control unit includes, in the operation part, one or more of the fingers and the palm of the hand.
(4)
The information processing apparatus according to (3), in which the control unit includes an index finger in the operation part.
(5)
The information processing apparatus according to (3) or (4), in which the control unit sets one of the fingers of the hand as a determination target finger and determines whether or not the state of the hand is the touch operation permitting state based on the state of the determination target finger.
(6)
The information processing apparatus according to (5), in which the control unit includes a thumb in the determination target finger.
(7)
The information processing apparatus according to (6), in which the control unit determines that the state of the hand is the touch operation permitting state when the fingertip of the thumb is hidden by the palm.
(8)
The information processing apparatus according to (7), in which the control unit includes the index finger in the operation part and determines that the state of the hand is the touch operation permitting state when the fingertip of the thumb is touching a finger other than the index finger.
(9)
The information processing apparatus according to any one of (6) to (8), in which the control unit includes the index finger in the operation part and determines that the state of the hand is the touch operation permitting state when the fingertip of the thumb is touching the index finger.
(10)
The information processing apparatus according to any one of (5) to (9), in which the control unit includes one or more of a middle finger, a ring finger, and a little finger in the determination target finger.
(11)
The information processing apparatus according to (10), in which the control unit determines that the state of the hand is the touch operation permitting state when the fingertip of the determination target finger is hidden by the palm.
(12)
The information processing apparatus according to (10) or (11), in which the control unit includes the index finger in the operation part and the middle finger in the determination target finger, and determines that the state of the hand is the touch operation permitting state when the middle finger is touching the index finger.
(13)
The information processing apparatus according to any one of (1) to (12), in which the control unit determines the type of the input operation based on the state of the hand when the distance between a real object existing in a real space and the operation part falls within the predetermined range.
(14)
The information processing apparatus according to (13), in which the control unit performs control to display an image that is the target of the input operation on the real object.
(15)
The information processing apparatus according to (13) or (14), in which the control unit performs control to display a virtual object superimposed on the real object.
(16)
The information processing apparatus according to any one of (1) to (15), in which the control unit places a virtual object in a real space and determines whether or not the state of the hand is the touch operation permitting state when the distance between the virtual object and the operation part falls within a predetermined range.
(17)
The information processing apparatus according to any one of (1) to (16), in which the control unit performs processing according to the input operation.
(18)
An information processing method in which a processor, when the distance between an object and an operation part of a hand falls within a predetermined range, determines the type of an input operation based on the state of the hand.
(19)
A program that causes a computer to implement a control function of, when the distance between an object and an operation part of a hand falls within a predetermined range, determining the type of an input operation based on the state of the hand.
10, 20  Information processing apparatus
11, 21  Storage unit
12, 22  Display unit
13, 23  Detection unit
14, 24  Control unit
RH  Right hand
RH1  Thumb
RH2  Index finger
RH3  Middle finger
RH4  Ring finger
RH5  Little finger

Claims (19)

1. An information processing apparatus comprising a control unit that determines the type of input operation based on the state of a hand when the distance between an object and an operation part of the hand falls within a predetermined range.

2. The information processing apparatus according to claim 1, wherein the control unit determines that the input operation is a touch operation when it determines that the state of the hand is a touch-operation-permitted state.

3. The information processing apparatus according to claim 2, wherein the control unit includes, in the operation part, one or more of the fingers and the palm of the hand.

4. The information processing apparatus according to claim 3, wherein the control unit includes the forefinger in the operation part.

5. The information processing apparatus according to claim 3, wherein the control unit sets one of the fingers of the hand as a determination target finger and determines, based on the state of the determination target finger, whether the state of the hand is the touch-operation-permitted state.

6. The information processing apparatus according to claim 5, wherein the control unit includes the thumb in the determination target finger.

7. The information processing apparatus according to claim 6, wherein the control unit determines that the state of the hand is the touch-operation-permitted state when the fingertip of the thumb is hidden by the palm.

8. The information processing apparatus according to claim 7, wherein the control unit includes the forefinger in the operation part and determines that the state of the hand is the touch-operation-permitted state when the fingertip of the thumb touches a finger other than the forefinger.

9. The information processing apparatus according to claim 6, wherein the control unit includes the forefinger in the operation part and determines that the state of the hand is the touch-operation-permitted state when the fingertip of the thumb touches the forefinger.

10. The information processing apparatus according to claim 5, wherein the control unit includes one or more of the middle finger, the ring finger, and the little finger in the determination target finger.

11. The information processing apparatus according to claim 10, wherein the control unit determines that the state of the hand is the touch-operation-permitted state when the fingertip of the determination target finger is hidden by the palm.

12. The information processing apparatus according to claim 10, wherein the control unit includes the forefinger in the operation part and the middle finger in the determination target finger, and determines that the state of the hand is the touch-operation-permitted state when the middle finger touches the forefinger.

13. The information processing apparatus according to claim 1, wherein the control unit determines the type of input operation based on the state of the hand when the distance between a real object existing in a real space and the operation part falls within the predetermined range.

14. The information processing apparatus according to claim 13, wherein the control unit performs control to display an image to be a target of the input operation on the real object.

15. The information processing apparatus according to claim 13, wherein the control unit performs control to display a virtual object superimposed on the real object.

16. The information processing apparatus according to claim 1, wherein the control unit arranges a virtual object in a real space and determines whether the state of the hand is a touch-operation-permitted state when the distance between the virtual object and the operation part falls within a predetermined range.

17. The information processing apparatus according to claim 1, wherein the control unit performs processing according to the input operation.

18. An information processing method in which a processor determines the type of input operation based on the state of a hand when the distance between an object and an operation part of the hand falls within a predetermined range.

19. A program for causing a computer to realize a control function of determining the type of input operation based on the state of a hand when the distance between an object and an operation part of the hand falls within a predetermined range.
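As an illustrative aid to reading the claims above, the minimal Python sketch below shows one possible way the claimed determination could be carried out, taking the forefinger as the operation part and inferring a touch-operation-permitted state from the thumb fingertip being folded toward the palm (compare claims 4, 6, and 7). It is not part of the disclosure or the claims; the landmark names, the numeric thresholds, and the "hover" fallback are assumptions introduced here only for clarity.

from dataclasses import dataclass
from typing import Dict, Tuple

Point3D = Tuple[float, float, float]


def _distance(a: Point3D, b: Point3D) -> float:
    # Euclidean distance between two 3-D points.
    return sum((ax - bx) ** 2 for ax, bx in zip(a, b)) ** 0.5


@dataclass
class HandState:
    # Hypothetical hand-tracking output: 3-D positions keyed by landmark name,
    # e.g. "forefinger_tip", "thumb_tip", "palm_center".
    landmarks: Dict[str, Point3D]


def classify_input_operation(
    hand: HandState,
    object_position: Point3D,
    proximity_threshold: float = 0.03,   # assumed "predetermined range" of 3 cm
    palm_fold_threshold: float = 0.04,   # assumed: thumb tip within 4 cm of the palm center
) -> str:
    """Return 'touch', 'hover', or 'none' for the current frame (illustrative only)."""
    # The forefinger fingertip is treated as the operation part of the hand.
    operation_part = hand.landmarks["forefinger_tip"]

    # Step 1: is the operation part within the predetermined range of the object?
    if _distance(operation_part, object_position) > proximity_threshold:
        return "none"

    # Step 2: is the hand in a touch-operation-permitted state?
    # Here the cue is the thumb fingertip being folded toward (hidden by) the palm.
    thumb_tip = hand.landmarks["thumb_tip"]
    palm_center = hand.landmarks["palm_center"]
    if _distance(thumb_tip, palm_center) < palm_fold_threshold:
        return "touch"

    # Near the object but not in the permitted state: treat as a non-touch gesture.
    return "hover"

In practice the thresholds and the chosen cue would depend on the hand-tracking sensor and the application; the point of the sketch is only the two-step order of the determination, proximity first and hand state second.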
PCT/JP2018/023548 2017-08-25 2018-06-21 Information processing device, information processing method, and program WO2019039065A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-162074 2017-08-25
JP2017162074 2017-08-25

Publications (1)

Publication Number Publication Date
WO2019039065A1 true WO2019039065A1 (en) 2019-02-28

Family

ID=65440016

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/023548 WO2019039065A1 (en) 2017-08-25 2018-06-21 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2019039065A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004258714A (en) * 2003-02-24 2004-09-16 Toshiba Corp Device for displaying picture
JP2012068854A (en) * 2010-09-22 2012-04-05 Shimane Prefecture Operation input device and operation determination method and program
US20150040040A1 (en) * 2013-08-05 2015-02-05 Alexandru Balan Two-hand interaction with natural user interface
US20150058782A1 (en) * 2013-08-21 2015-02-26 Intel Corporation System and method for creating and interacting with a surface display
JP2017073128A (en) * 2015-10-08 2017-04-13 船井電機株式会社 Space input device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021075103A1 (en) * 2019-10-17 2021-04-22 ソニー株式会社 Information processing device, information processing method, and program
US12216830B2 (en) 2019-10-17 2025-02-04 Sony Group Corporation Information processing apparatus and information processing method

Similar Documents

Publication Publication Date Title
US20190250714A1 (en) Systems and methods for triggering actions based on touch-free gesture detection
JP2022118183A (en) Systems and methods of direct pointing detection for interaction with digital device
EP2817693B1 (en) Gesture recognition device
EP2943836B1 (en) Head mount display device providing eye gaze calibration and control method thereof
US11360605B2 (en) Method and device for providing a touch-based user interface
CN105144072B (en) Pressure sensitivity is simulated on multi-point touch control apparatus
TW201633066A (en) 3D visualization
US9916043B2 (en) Information processing apparatus for recognizing user operation based on an image
US10126854B2 (en) Providing touch position information
JP2017062709A (en) Gesture operation device
US20160139762A1 (en) Aligning gaze and pointing directions
CN107179876B (en) Man-machine interaction device based on virtual reality system
Wolf et al. Tickle: a surface-independent interaction technique for grasp interfaces
US20210089161A1 (en) Virtual input devices for pressure sensitive surfaces
JP2015052819A (en) Electronic device
WO2019039065A1 (en) Information processing device, information processing method, and program
WO2017163566A1 (en) Program, computer device, program execution method, and system
CN106575184B (en) Information processing apparatus, information processing method, and computer readable medium
KR102325684B1 (en) Eye tracking input apparatus thar is attached to head and input method using this
JP6289655B2 (en) Screen operation apparatus and screen operation method
CN106155284B (en) Electronic equipment and information processing method
US9720513B2 (en) Apparatus and method for receiving a key input
KR101004671B1 (en) Network terminal device with space projection and space touch function and control method thereof
US11360632B2 (en) Information processing system
WO2015141529A1 (en) Input apparatus and input control program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18847415

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 18847415

Country of ref document: EP

Kind code of ref document: A1
