
WO2019039065A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2019039065A1
Authority
WO
WIPO (PCT)
Prior art keywords
hand
control unit
state
information processing
finger
Prior art date
Application number
PCT/JP2018/023548
Other languages
English (en)
Japanese (ja)
Inventor
淳 入江
Original Assignee
Sony Corporation (ソニー株式会社)
Priority date
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2019039065A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion

Definitions

  • The present disclosure relates to an information processing device, an information processing method, and a program.
  • Patent Document 1 discloses an operation device that, when the user performs a gesture operation using a hand (that is, an input operation), transmits to a device an operation command based on the angle between fingers.
  • In the technique of Patent Document 1, however, an operation command not intended by the user may be transmitted to the device. That is, the input operation performed by the user could be erroneously recognized.
  • Therefore, the present disclosure proposes a new and improved information processing apparatus, information processing method, and program that can more accurately determine an input operation performed by a user using a hand.
  • According to the present disclosure, an information processing apparatus is provided that includes a control unit that determines the type of input operation based on the state of the hand when the distance between an object and the operation site of the hand falls within a predetermined range.
  • According to the present disclosure, an information processing method is provided in which a processor determines the type of input operation based on the state of the hand when the distance between an object and the operation site of the hand falls within a predetermined range.
  • According to the present disclosure, a program is provided that causes a computer to realize a control function of determining the type of input operation based on the state of the hand when the distance between an object and the operation site of the hand falls within a predetermined range.
  • As described above, the control unit or the like determines the type of the input operation based on the state of the hand when the distance between the object and the operation site of the hand falls within a predetermined range.
  • In other words, the control unit or the like determines the type of the input operation in consideration of both the distance between the object and the operation site of the hand and the state of the hand. Therefore, the control unit or the like can more accurately determine the user's input operation.
  • Note that the above-mentioned effects are not necessarily limited; along with or in place of the above-mentioned effects, any of the effects shown in the present specification, or other effects that can be understood from the present specification, may be achieved.
  • Under such circumstances, the present inventor diligently studied a technique capable of more accurately determining the gesture operation, that is, the input operation, performed by the user using a hand.
  • First, the inventor examined a technique for determining the type of input operation based on the distance between a real object and the operation site of the hand.
  • Here, the real object is the portion on which the user performs an input operation.
  • For example, an image on which the user can perform an input operation can be displayed on the real object.
  • The operation site is the site with which the user performs an input operation, and is, for example, the fingertip of the index finger.
  • Specifically, the inventor examined a technique for determining that the input operation is a touch operation when the operation site of the hand contacts the real object. This is because, when the user wants to operate an operation target, it can be assumed that the operation site of the hand approaches the operation target, and the influence of individual differences on this distance can be said to be small.
  • The present inventor then made further studies. As a result, the inventor found that the user's input operation can be determined more accurately by focusing on both the distance between the object and the operation site of the hand and the state of the hand, and by using these pieces of information together. The present embodiments were conceived based on this finding and are described in detail below.
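As an illustration of the two-stage idea described above (and not as code from the patent itself), the following minimal Python sketch shows a distance gate followed by a hand-state classification. The names, the enum values, and the 3 cm threshold are assumptions made for this example.

```python
from enum import Enum, auto


class InputOperation(Enum):
    TOUCH = auto()  # operation site is near the object and the hand permits touch
    OTHER = auto()  # e.g. a hover operation
    NONE = auto()   # operation site is too far away; no determination is made


# Hypothetical threshold for the "predetermined range" of about several centimeters.
TOUCH_DISTANCE_RANGE_M = 0.03  # 3 cm, an assumed value


def determine_input_operation(distance_m: float, touch_permitted: bool) -> InputOperation:
    """Determine the type of input operation from the object-to-operation-site
    distance and the state of the hand (already reduced to a boolean here)."""
    if distance_m > TOUCH_DISTANCE_RANGE_M:
        # Distance exceeds the predetermined range: no operation is determined.
        return InputOperation.NONE
    # Within range: the hand state decides between touch and non-touch operations.
    return InputOperation.TOUCH if touch_permitted else InputOperation.OTHER


if __name__ == "__main__":
    print(determine_input_operation(0.01, touch_permitted=True))   # TOUCH
    print(determine_input_operation(0.01, touch_permitted=False))  # OTHER
    print(determine_input_operation(0.20, touch_permitted=True))   # NONE
```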
  • The information processing apparatus 10 includes a storage unit 11, a display unit 12, a detection unit 13, a control unit 14, and a housing unit 15.
  • The information processing apparatus 10 is, for example, a projection apparatus.
  • The information processing apparatus 10 has a hardware configuration including a central processing unit (CPU), a read only memory (ROM), a random access memory (RAM), a non-volatile memory, a display device, a detection device, and a housing.
  • The ROM stores information necessary for the operation of the information processing apparatus 10, such as a program.
  • The CPU reads and executes the program stored in the ROM.
  • The display device projects images onto various real objects (for example, the table 100, the wall 200, and the like), thereby displaying the images on these real objects.
  • The detection device captures, for example, the user's hand and a real object.
  • The detection device detects the state (for example, the shape) and the position of the hand of the user (the person U shown in FIGS. 2 and 3) based on the captured image. Furthermore, the detection device also detects the position, shape, and the like of the real object.
  • The detection device is, for example, a depth sensor, but it is not limited to this as long as it can realize the above-mentioned functions.
  • For example, the detection device may be any device capable of specifying the positions of the three-dimensional feature points of the hand.
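For context only, a depth sensor of this kind is often exposed to application code as a set of named three-dimensional feature points. The sketch below assumes such a hypothetical landmark dictionary and a planar model of the real object (such as the table 100 or the wall 200) to compute the fingertip-to-object distance; none of these data structures or names come from the patent.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

Point3D = Tuple[float, float, float]


@dataclass
class HandObservation:
    """Hypothetical output of the detection unit: named 3D feature points of the hand (meters)."""
    landmarks: Dict[str, Point3D]  # e.g. "index_tip", "thumb_tip", "palm_center", ...


@dataclass
class PlanarSurface:
    """A real object approximated as a plane (e.g. the table 100 or the wall 200)."""
    point: Point3D   # a point on the surface
    normal: Point3D  # unit normal of the surface


def distance_to_surface(p: Point3D, surface: PlanarSurface) -> float:
    """Unsigned point-to-plane distance: |(p - q) . n| for a unit normal n."""
    d = sum((pi - qi) * ni for pi, qi, ni in zip(p, surface.point, surface.normal))
    return abs(d)


if __name__ == "__main__":
    table = PlanarSurface(point=(0.0, 0.0, 0.0), normal=(0.0, 0.0, 1.0))
    hand = HandObservation(landmarks={"index_tip": (0.10, 0.05, 0.02)})
    print(f"fingertip-to-table distance: {distance_to_surface(hand.landmarks['index_tip'], table):.3f} m")
```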
  • The housing incorporates the above hardware components.
  • The storage unit 11, the display unit 12, the detection unit 13, the control unit 14, and the housing unit 15 are realized by these hardware components.
  • The information processing apparatus 10 may further include operation buttons such as a power button. These operation buttons allow input operations by the user.
  • The information processing apparatus 10 may also include a communication device or the like that can communicate with other information processing apparatuses.
  • The storage unit 11 is configured by, for example, the ROM and stores various types of information.
  • The storage unit 11 stores the images displayed by the display unit 12, the programs necessary for the processing of the information processing apparatus 10, and the like.
  • The display unit 12 includes, for example, the display device.
  • The display unit 12 displays an image on a real object by projecting the image onto various real objects (for example, the table 100, the wall 200, and the like). In the example shown in FIGS. 2 and 3, the display unit 12 displays the image 12A on the real object.
  • The type of real object on which the display unit 12 displays images is not particularly limited. The type of image displayed by the display unit 12 is not particularly limited either. For example, the display unit 12 may display an image to be the target of an input operation by the user.
  • The detection unit 13 includes, for example, the detection device.
  • The detection unit 13 captures, for example, the user's hand and a real object. The detection unit 13 then detects the state and the position of the user's hand (for example, the right hand RH) based on the captured image.
  • Here, the state of the hand includes, for example, the shape of the hand.
  • The position of the hand includes the positions of the fingers, the palm, and the like.
  • The detection unit 13 also detects the position, shape, and the like of the real object based on the captured image. The detection unit 13 then outputs detection information on the detection result to the control unit 14.
  • The control unit 14 is configured by, for example, a CPU (processor) or the like and controls each component of the information processing apparatus 10. For example, the control unit 14 causes the display unit 12 to display various images. Furthermore, the control unit 14 recognizes the distance between the real object and the operation site of the hand based on the detection information.
  • Here, the real object is an object onto which an image is projected from the display unit 12, and is, for example, the table 100, the wall 200, or the like.
  • The operation site is a site with which the user performs an input operation.
  • The control unit 14 includes, in the operation site, one or more of the fingers and the palm included in the hand. More specifically, the control unit 14 includes the index finger in the operation site. Still more specifically, the control unit 14 includes the fingertip of the index finger in the operation site. The fingertip is, for example, the portion beyond the first joint. For example, the control unit 14 includes the fingertip RH21 of the index finger RH2 of the right hand RH in the operation site.
  • The operation site is not limited to this and may be the fingertip of another finger, the palm, the left hand, or the like.
  • The operation site may be set in advance or may be set arbitrarily by the user. Modifications of the operation site are described later.
  • The control unit 14 determines the type of the input operation based on the state of the hand. For example, the control unit 14 determines whether the input operation is a touch operation based on the state of the hand. Thus, the control unit 14 can easily and accurately determine whether the user is performing a touch operation.
  • Here, a touch operation is any of various operations performed by touching a real object, in other words, an image displayed on the real object. Examples of the touch operation include a selection operation (an operation of selecting the touched portion) and a drawing operation (an operation of drawing a line along the movement trajectory by moving the touching operation site). The specific determination method is described later.
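To make the difference between a selection operation and a drawing operation concrete, here is a small, hedged sketch that classifies a touch by how far the operation site moves while touching; the 1 cm drag threshold and the function names are assumptions, not part of the disclosure.

```python
from typing import List, Tuple

Point2D = Tuple[float, float]  # touch point projected onto the displayed image plane

# Assumed threshold: movement below this is treated as a stationary tap (selection).
DRAG_THRESHOLD_M = 0.01  # 1 cm


def classify_touch(trajectory: List[Point2D]) -> str:
    """Classify a touch trajectory as a 'selection' (tap) or 'drawing' (drag)."""
    if len(trajectory) < 2:
        return "selection"
    (x0, y0), (x1, y1) = trajectory[0], trajectory[-1]
    moved = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return "drawing" if moved >= DRAG_THRESHOLD_M else "selection"


if __name__ == "__main__":
    print(classify_touch([(0.000, 0.000), (0.001, 0.000)]))                # selection
    print(classify_touch([(0.000, 0.000), (0.020, 0.010), (0.050, 0.020)]))  # drawing
```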
  • Then, the control unit 14 performs processing according to the input operation.
  • The housing unit 15 includes, for example, the above-described housing and incorporates the above hardware components, in other words, the storage unit 11, the display unit 12, the detection unit 13, and the control unit 14.
  • In step S10, the display unit 12 displays an image on a real object.
  • The user brings, for example, the index finger RH2 of the right hand RH close to the image in order to perform an input operation on the image.
  • The detection unit 13 captures an image of the user's hand and the real object (the real object on which the image is displayed).
  • In this example, the detection unit 13 captures an image of the user's right hand RH and the real object.
  • The detection unit 13 detects the state and the position of the user's hand based on the captured image. Furthermore, the detection unit 13 also detects the position and the shape of the real object. Subsequently, the detection unit 13 outputs detection information on the detection result to the control unit 14.
  • The control unit 14 specifies the distance between the real object and the operation site of the hand based on the detection information.
  • In this example, the control unit 14 specifies the distance between the real object and the fingertip RH21.
  • Next, the control unit 14 determines whether the distance between the real object and the operation site of the hand is within a predetermined range.
  • The predetermined range here is not particularly limited. However, in the processing described later, the control unit 14 determines whether the input operation is a touch operation, so it is preferable that the predetermined range be somewhat small; for example, it may be about several centimeters. The predetermined range may be set in advance or may be set by the user.
  • If the control unit 14 determines that the distance between the real object and the operation site is within the predetermined range, the process proceeds to step S20, and the control unit 14 then determines the type of the input operation based on the state of the hand. If the distance between the real object and the operation site of the hand exceeds the predetermined range, the control unit 14 ends this processing.
  • In step S20, the control unit 14 determines the state of the hand based on the detection information.
  • In this example, the control unit 14 specifies the state of the right hand RH.
  • In step S30, the control unit 14 determines whether the state of the hand is the touch operation permission state.
  • Here, the touch operation permission state means the state of the hand in which the user is permitted to perform a touch operation.
  • The control unit 14 sets one of the fingers of the hand as a determination target finger.
  • For example, the control unit 14 may include the thumb in the determination target finger.
  • In this case, the control unit 14 may determine that the state of the hand is the touch operation permission state when the fingertip of the thumb is hidden in the palm.
  • Here, the state in which the fingertip of the thumb is hidden in the palm means, for example, that the fingertip of the thumb is present at a position facing the palm.
  • When the detection unit 13 captures an image of the hand from the back side of the hand, the thumb is hidden by the palm, so the thumb hardly appears in the captured image.
  • When the detection unit 13 captures the hand from the palm side, the thumb present over the palm is captured.
  • In the state shown in FIG. 5A, the state of the right hand RH is the touch operation permission state. That is, the state shown in FIG. 5A is an example of the touch operation permission state.
  • On the other hand, in the state shown in FIG. 5B, the control unit 14 may determine that the right hand RH is not in the touch operation permission state.
  • Comparing FIGS. 5A and 5B, these figures differ in the state of the thumb.
  • In other words, when the thumb is included in the determination target finger, whether or not the state of the hand is the touch operation permission state depends on the state of the thumb.
  • Here, the thumb tends to be short and thick compared with the other fingers, and this tendency is considered to show little individual difference. For this reason, the thumb is often easier to identify than the other fingers. Therefore, by including the thumb in the determination target finger, the control unit 14 can more easily and accurately determine whether the state of the hand is the touch operation permission state.
  • Furthermore, the control unit 14 determines whether the state of the hand is the touch operation permission state based on whether the fingertip of the thumb is hidden by the palm. As shown in FIGS. 5A and 5B, the difference in shape between the case where the fingertip of the thumb is hidden in the palm and the case where it is not is clear. Therefore, the control unit 14 can more easily and accurately determine whether the state of the hand is the touch operation permission state.
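One plausible geometric reading of this thumb-based check is sketched below: the fingertip of the thumb is treated as hidden in the palm when it lies close to the palm center. The landmark names and the 4 cm threshold are assumptions; the patent does not specify how the check is implemented.

```python
import math
from typing import Dict, Tuple

Point3D = Tuple[float, float, float]

# Assumed threshold: a thumb tip closer than this to the palm center counts as "hidden in the palm".
THUMB_HIDDEN_THRESHOLD_M = 0.04  # 4 cm


def is_touch_permission_state(landmarks: Dict[str, Point3D]) -> bool:
    """Return True when the fingertip of the thumb is hidden in the palm,
    i.e. the hand is in the touch operation permission state (a FIG. 5A-like pose)."""
    thumb_tip = landmarks["thumb_tip"]
    palm_center = landmarks["palm_center"]
    return math.dist(thumb_tip, palm_center) < THUMB_HIDDEN_THRESHOLD_M


if __name__ == "__main__":
    closed_thumb = {"thumb_tip": (0.01, 0.00, 0.00), "palm_center": (0.00, 0.00, 0.00)}
    open_thumb = {"thumb_tip": (0.08, 0.02, 0.00), "palm_center": (0.00, 0.00, 0.00)}
    print(is_touch_permission_state(closed_thumb))  # True
    print(is_touch_permission_state(open_thumb))    # False
```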
  • If the control unit 14 determines that the state of the hand is the touch operation permission state, the control unit 14 determines that the input operation performed by the user is a touch operation, and the process proceeds to step S40.
  • If the control unit 14 determines that the state of the hand is not the touch operation permission state, the control unit 14 determines that the input operation performed by the user is an operation other than a touch operation, and the process proceeds to step S50.
  • In this way, the control unit 14 determines the type of the input operation based on the state of the hand.
  • In step S40, the control unit 14 accepts a touch operation by the user. That is, when the user performs a touch operation (input operation) using the operation site, the control unit 14 performs processing according to the touch operation. For example, as shown in FIG. 6A, when the user moves the fingertip RH21 of the index finger RH2 over the image 12A, the control unit 14 draws the line image 12B along the movement trajectory. Thereafter, the control unit 14 ends the present processing.
  • In step S50, the control unit 14 does not accept the touch operation.
  • However, the control unit 14 may accept an input operation other than the touch operation.
  • For example, when the user performs a hover operation, the control unit 14 may perform processing according to the hover operation (for example, page-turning processing). Thereafter, the control unit 14 ends the present processing.
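Putting the pieces together, the flow from step S10 to step S50 could be organized as in the following self-contained sketch. The sensor interface, the landmark names, and both thresholds are assumptions used only for illustration.

```python
from typing import Dict, Optional, Tuple

Point3D = Tuple[float, float, float]

TOUCH_DISTANCE_RANGE_M = 0.03     # assumed "predetermined range" (several centimeters)
THUMB_HIDDEN_THRESHOLD_M = 0.04   # assumed threshold for the thumb-in-palm check


def _dist(a: Point3D, b: Point3D) -> float:
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5


def process_frame(landmarks: Dict[str, Point3D], surface_point: Point3D,
                  surface_normal: Point3D) -> Optional[str]:
    """One pass over the S10-S50 flow for a single detection frame.

    Returns 'touch', 'other' (e.g. hover), or None when no operation is determined."""
    # S10 equivalent: the image is already displayed; here we only evaluate the detection result.
    fingertip = landmarks["index_tip"]

    # Distance between the real object (modeled as a plane) and the operation site.
    distance = abs(sum((fi - si) * ni for fi, si, ni
                       in zip(fingertip, surface_point, surface_normal)))
    if distance > TOUCH_DISTANCE_RANGE_M:
        return None  # out of the predetermined range: processing ends

    # S20/S30: determine the state of the hand (is the thumb tip hidden in the palm?).
    touch_permitted = _dist(landmarks["thumb_tip"], landmarks["palm_center"]) < THUMB_HIDDEN_THRESHOLD_M

    # S40/S50: accept the touch operation or treat it as another operation (e.g. hover).
    return "touch" if touch_permitted else "other"


if __name__ == "__main__":
    frame = {"index_tip": (0.10, 0.05, 0.01),
             "thumb_tip": (0.02, 0.01, 0.05),
             "palm_center": (0.00, 0.00, 0.05)}
    print(process_frame(frame, surface_point=(0, 0, 0), surface_normal=(0, 0, 1)))  # touch
```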
  • As described above, the control unit 14 determines the type of the input operation based on the state of the hand when the distance between the real object and the operation site of the hand falls within a predetermined range. That is, the control unit 14 determines the type of the input operation in consideration of both the distance between the real object and the operation site of the hand and the state of the hand. Therefore, the control unit 14 can more accurately determine the user's input operation. In particular, the control unit 14 considers the distance between the real object and the operation site of the hand, for which individual differences are considered to be small.
  • Furthermore, the control unit 14 determines the type of the user's input operation based on the state of the hand. Therefore, the control unit 14 can more easily and accurately determine the user's input operation, and as a result, the user can perform input operations more accurately.
  • When the control unit 14 determines that the state of the hand is the touch operation permission state, it can determine that the input operation is a touch operation. Thereby, the control unit 14 can more easily and accurately determine whether the user has performed a touch operation.
  • The control unit 14 includes, in the operation site, one or more of the fingers and the palm included in the hand. Therefore, the control unit 14 can more easily and accurately grasp whether or not the operation site approaches the real object.
  • The control unit 14 includes the index finger in the operation site.
  • That is, the control unit 14 determines the type of the input operation in consideration of the distance between the index finger and the real object. Therefore, the control unit 14 can more easily and accurately determine the input operation performed by the user.
  • The control unit 14 sets one of the fingers of the hand as the determination target finger and determines whether the state of the hand is the touch operation permission state based on the state of the determination target finger. Therefore, the control unit 14 can determine the state of the hand in more detail and, in turn, can more easily and accurately determine the type of the input operation.
  • The control unit 14 includes the thumb in the determination target finger.
  • As described above, the thumb tends to be shorter and thicker than the other fingers, and this tendency shows little individual difference. For this reason, the thumb is often easier to detect than the other fingers. Therefore, by including the thumb in the determination target finger, the control unit 14 can more easily and accurately determine whether the state of the hand is the touch operation permission state.
  • The control unit 14 determines that the state of the hand is the touch operation permission state when the fingertip of the thumb is hidden by the palm. As mentioned above, the difference in shape between the case where the fingertip of the thumb is hidden in the palm and the case where it is not is clear. Therefore, the control unit 14 can more easily and accurately determine whether the state of the hand is the touch operation permission state.
  • The control unit 14 performs control to display an image on a real object. Therefore, the user can more accurately perform input operations using the image.
  • Next, a second embodiment of the present disclosure will be described. First, the configuration of the information processing apparatus 20 according to the second embodiment will be described based on FIGS. 7 to 9B.
  • The information processing apparatus 20 includes a storage unit 21, a display unit 22, a detection unit 23, a control unit 24, a frame unit 25, and a housing unit 26.
  • The information processing apparatus 20 is, for example, an eyewear-type information processing apparatus. Although the information processing apparatus 20 has a glasses shape in FIG. 8, it may have a goggle shape or the like.
  • The information processing apparatus 20 has a hardware configuration including a central processing unit (CPU), a read only memory (ROM), a random access memory (RAM), a non-volatile memory, a display device, a detection device, a frame, and a housing.
  • The ROM stores information necessary for the operation of the information processing apparatus 20, such as a program.
  • The CPU reads and executes the program stored in the ROM.
  • The display device is configured to correspond to the lens portions of the glasses and is a transmissive display device.
  • The display device displays various images on its display surface. The user can visually recognize both the image displayed on the display device and the real objects in the real space. That is, the displayed image is a so-called virtual object.
  • The display device may display various images (virtual objects) superimposed on real objects.
  • The detection device is similar to that of the first embodiment.
  • The frame includes a front portion that frames the display device, temple portions that rest on the user's ears, and the like.
  • The housing incorporates the above hardware components and is provided in a temple portion. The installation position of the housing is not limited to this example.
  • The storage unit 21, the display unit 22, the detection unit 23, the control unit 24, the frame unit 25, and the housing unit 26 are realized by these hardware components.
  • The information processing apparatus 20 may further include operation buttons such as a power button. These operation buttons allow input operations by the user.
  • The information processing apparatus 20 may also include a communication device or the like that can communicate with other information processing apparatuses.
  • The storage unit 21 is configured by, for example, the ROM and stores various types of information.
  • The storage unit 21 stores the images displayed by the display unit 22, the programs necessary for the processing of the information processing apparatus 20, and the like.
  • The display unit 22 includes, for example, the display device.
  • The display unit 22 superimposes and displays images on various real objects (for example, the left hand LH shown in FIGS. 9A and 9B). In the example illustrated in FIGS. 9A and 9B, the display unit 22 displays the image 22A superimposed on the real object (the left hand LH).
  • The type of real object on which the display unit 22 displays images is not particularly limited.
  • For example, the display unit 22 may superimpose images on various walls, signs, and the like.
  • The type of image displayed by the display unit 22 is not particularly limited either.
  • For example, the display unit 22 may display an image to be the target of an input operation by the user.
  • The detection unit 23 includes, for example, the detection device.
  • The function of the detection unit 23 is the same as that of the detection unit 13 of the first embodiment. That is, the detection unit 23 captures, for example, the user's hand and a real object.
  • Here, the real object is an object on which an image is superimposed.
  • The detection unit 23 then detects the state and the position of the user's hand (for example, the right hand RH) based on the captured image.
  • Here, the state of the hand includes, for example, the shape of the hand.
  • The position of the hand includes the positions of the fingers, the palm, and the like.
  • The detection unit 23 also detects the position, shape, and the like of the real object based on the captured image. The detection unit 23 then outputs detection information on the detection result to the control unit 24.
  • The control unit 24 is configured by, for example, a CPU (processor) or the like and controls each component of the information processing apparatus 20.
  • For example, the control unit 24 causes the display unit 22 to display various images.
  • Furthermore, the control unit 24 recognizes the distance between the real object and the operation site of the hand based on the detection information.
  • Here, the real object is an object on which an image is superimposed, and is, for example, the user's hand or the like.
  • The operation site is a site with which the user performs an input operation.
  • The specific content is the same as that of the first embodiment. That is, the control unit 24 includes, in the operation site, one or more of the fingers and the palm included in the hand. More specifically, the control unit 24 includes the index finger in the operation site. Still more specifically, the control unit 24 includes the fingertip of the index finger in the operation site. For example, the control unit 24 includes the fingertip RH21 of the index finger RH2 of the right hand RH in the operation site.
  • The operation site is not limited to this and may be the fingertip of another finger, the palm, the left hand, or the like.
  • The operation site may be set in advance or may be set arbitrarily by the user. Modifications of the operation site are described later.
  • The control unit 24 determines the type of the input operation based on the state of the hand when the distance between the real object and the operation site of the hand falls within a predetermined range. For example, the control unit 24 determines whether the input operation is a touch operation based on the state of the hand. Thus, the control unit 24 can easily and accurately determine whether the user is performing a touch operation. The specific determination method is described later. The control unit 24 then performs processing according to the input operation.
  • The frame unit 25 is configured by the above-described frame.
  • The housing unit 26 includes, for example, the above-described housing and incorporates the above hardware components, in other words, the storage unit 21, the display unit 22, the detection unit 23, and the control unit 24.
  • In step S10, the display unit 22 superimposes and displays an image on a real object.
  • The user brings, for example, the index finger RH2 of the right hand RH close to the real object on which the image is superimposed, in order to perform an input operation on the image.
  • The detection unit 23 captures an image of the user's hand and the real object (the real object on which the image is superimposed).
  • In this example, the detection unit 23 captures an image of the user's right hand RH and the real object.
  • The detection unit 23 detects the state and the position of the user's hand based on the captured image. Furthermore, the detection unit 23 also detects the position and the shape of the real object. Next, the detection unit 23 outputs detection information on the detection result to the control unit 24.
  • The control unit 24 specifies the distance between the real object and the operation site of the hand based on the detection information.
  • In this example, the control unit 24 specifies the distance between the real object and the fingertip RH21.
  • Next, the control unit 24 determines whether the distance between the real object and the operation site of the hand is within a predetermined range.
  • The predetermined range here is not particularly limited. However, in the processing described later, the control unit 24 determines whether the input operation is a touch operation, so it is preferable that the predetermined range be somewhat small; for example, it may be about several centimeters. The predetermined range may be set in advance or may be set by the user.
  • If the control unit 24 determines that the distance between the real object and the operation site is within the predetermined range, the process proceeds to step S20, and the control unit 24 then determines the type of the input operation based on the state of the hand. If the distance between the real object and the operation site of the hand exceeds the predetermined range, the control unit 24 ends this processing.
  • In step S20, the control unit 24 determines the state of the hand based on the detection information.
  • In this example, the control unit 24 specifies the state of the right hand RH.
  • In step S30, the control unit 24 determines whether the state of the hand is the touch operation permission state.
  • Here, the touch operation permission state means the state of the hand in which the user is permitted to perform a touch operation.
  • The control unit 24 sets one of the fingers of the hand as a determination target finger.
  • For example, the control unit 24 may include the thumb in the determination target finger.
  • In this case, the control unit 24 may determine that the state of the hand is the touch operation permission state when the fingertip of the thumb is hidden in the palm.
  • Here, the state in which the fingertip of the thumb is hidden in the palm means, for example, that the fingertip of the thumb is present at a position facing the palm.
  • When the detection unit 23 captures an image of the hand from the back side of the hand, the thumb is hidden by the palm, so the thumb hardly appears in the captured image.
  • When the detection unit 23 captures the hand from the palm side, the thumb present over the palm is captured.
  • If the control unit 24 determines that the state of the hand is the touch operation permission state, the control unit 24 determines that the input operation performed by the user is a touch operation, and the process proceeds to step S40.
  • If the control unit 24 determines that the state of the hand is not the touch operation permission state, the control unit 24 determines that the input operation performed by the user is an operation other than a touch operation, and the process proceeds to step S50.
  • In this way, the control unit 24 determines the type of the input operation based on the state of the hand.
  • In step S40, the control unit 24 accepts a touch operation by the user. That is, when the user performs a touch operation using the operation site, the control unit 24 performs processing according to the touch operation. For example, as shown in FIG. 9A, when the user moves the fingertip RH21 of the index finger RH2 over the image 22A, the control unit 24 draws a line image 22B along the movement trajectory. Thereafter, the control unit 24 ends the present processing.
  • In step S50, the control unit 24 does not accept the touch operation.
  • However, the control unit 24 may accept an input operation other than the touch operation.
  • For example, when the user performs a hover operation, the control unit 24 may perform processing according to the hover operation (for example, page-turning processing). Thereafter, the control unit 24 ends the present processing.
  • According to the second embodiment, the same effects as those of the first embodiment can be obtained. Furthermore, according to the second embodiment, even in the case of superimposing a virtual object on a real object, the input operation performed by the user can be determined more easily and accurately.
  • In the above example, the virtual object is superimposed on the real object, and the distance between the real object and the operation site is measured.
  • However, the second embodiment is not limited to this; for example, only a virtual object may be arranged in the real space. That is, the display unit 22 may display the virtual object at a position that does not overlap a real object.
  • In this case, the control unit 24 sets the position of the virtual object in the real space. The control unit 24 may then perform the above-described processing using the distance between the set position of the virtual object and the operation site. In this case, since the control unit 24 does not need to superimpose the virtual object on a real object, the control unit 24 can provide a wider variety of virtual objects to the user. Furthermore, the control unit 24 can accurately determine input operations on these virtual objects.
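In this virtual-object-only variant, the distance check targets the position at which the virtual object was placed rather than a detected surface. A hedged sketch with assumed names and the same assumed range:

```python
from typing import Tuple

Point3D = Tuple[float, float, float]

TOUCH_DISTANCE_RANGE_M = 0.03  # assumed "predetermined range"


def distance_to_virtual_object(operation_site: Point3D, anchor: Point3D) -> float:
    """Distance between the operation site and the position at which the virtual
    object was placed in real space (set by the control unit, not detected)."""
    return sum((a - b) ** 2 for a, b in zip(operation_site, anchor)) ** 0.5


def within_virtual_touch_range(operation_site: Point3D, anchor: Point3D) -> bool:
    return distance_to_virtual_object(operation_site, anchor) <= TOUCH_DISTANCE_RANGE_M


if __name__ == "__main__":
    anchor = (0.30, 0.10, 0.50)     # where the virtual object is displayed in real space
    fingertip = (0.31, 0.10, 0.51)  # detected operation site
    print(within_virtual_touch_range(fingertip, anchor))  # True
```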
  • In this first modification, the index finger RH2 is the operation site, and the thumb RH1 touches one of the middle finger RH3, the ring finger RH4, and the little finger RH5 (in this example, the middle finger RH3).
  • In this case, the control units 14 and 24 may determine that the state of the hand (the state of the right hand RH) is the touch operation permission state.
  • In another example, the index finger RH2 is the operation site, and the thumb RH1 is separated from the middle finger RH3, the ring finger RH4, and the little finger RH5.
  • In this case, the control units 14 and 24 may determine that the state of the hand (the state of the right hand RH) is not the touch operation permission state.
  • Alternatively, the control units 14 and 24 may determine whether the state of the hand is the touch operation permission state depending on whether the fingertip of the thumb touches the index finger. Specifically, when the fingertip of the thumb touches the index finger, the control units 14 and 24 may determine that the state of the hand is the touch operation permission state; when the fingertip of the thumb does not touch the index finger, they may determine that the state of the hand is not the touch operation permission state. Also in this case, the same effects as described above can be obtained, and the first modification provides the same effects as the first and second embodiments.
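For this first modification, the permission check becomes a contact test between the thumb tip and another fingertip. The sketch below approximates "touching" as the two landmarks lying within an assumed 1.5 cm of each other; the threshold and names are not from the patent.

```python
import math
from typing import Dict, Tuple

Point3D = Tuple[float, float, float]

# Assumed: two fingertips closer than this are treated as touching.
CONTACT_THRESHOLD_M = 0.015  # 1.5 cm


def thumb_touches(landmarks: Dict[str, Point3D], other_tip: str) -> bool:
    """True when the thumb tip is in contact with the named fingertip
    (e.g. 'middle_tip' or 'index_tip', depending on which variant is used)."""
    return math.dist(landmarks["thumb_tip"], landmarks[other_tip]) < CONTACT_THRESHOLD_M


if __name__ == "__main__":
    hand = {"thumb_tip": (0.00, 0.00, 0.00),
            "middle_tip": (0.01, 0.00, 0.00),
            "index_tip": (0.06, 0.02, 0.00)}
    print(thumb_touches(hand, "middle_tip"))  # True  -> touch operation permission state
    print(thumb_touches(hand, "index_tip"))   # False
```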
  • In the first and second embodiments, the thumb is used as the determination target finger, but another finger may be used as the determination target finger.
  • For example, the control units 14 and 24 may include one or more of the middle finger, the ring finger, and the little finger in the determination target finger. The control units 14 and 24 may then determine whether the state of the hand is the touch operation permission state based on whether the fingertip of the determination target finger is hidden in the palm. Specifically, when the fingertip of the determination target finger is hidden by the palm, the control units 14 and 24 may determine that the state of the hand is the touch operation permission state.
  • The control units 14 and 24 may determine that the state of the hand is not the touch operation permission state when the fingertip of the determination target finger is separated from the palm.
  • In this second modification, one or more of the middle finger RH3, the ring finger RH4, and the little finger RH5 are the determination target fingers.
  • In the illustrated case, the control units 14 and 24 determine that the state of the hand (the state of the right hand RH) is not the touch operation permission state.
  • Alternatively, the control units 14 and 24 may determine whether the state of the hand is the touch operation permission state depending on whether the middle finger is touching the index finger. Specifically, when the middle finger touches the index finger, the control units 14 and 24 may determine that the state of the hand is the touch operation permission state; when the middle finger does not touch the index finger, they may determine that the state of the hand is not the touch operation permission state. In the example shown in FIG. 12, the middle finger RH3 touches the index finger RH2, so the control units 14 and 24 may determine that the state of the hand (the state of the right hand RH) is the touch operation permission state.
  • Also in the second modification, the same effects as those of the first and second embodiments can be obtained.
  • Note, however, that when the thumb is used as the determination target finger, it may be easier to determine whether the state of the hand is the touch operation permission state.
  • In the first and second embodiments, the index finger is used as the operation site, but another site may be used as the operation site.
  • For example, the little finger RH5 of the right hand RH may be used as the operation site.
  • Alternatively, the thumb RH1 of the right hand RH may be used as the operation site.
  • In these cases, the control units 14 and 24 may determine whether the state of the hand (the state of the right hand RH) is the touch operation permission state based on the state of a finger other than the operation site (the thumb RH1 in the illustrated example). Also in the third modification, the same effects as those of the first and second embodiments can be obtained.
  • In step S30, the control units 14 and 24 may determine that the state of the hand is the touch operation permission state when one or more of the conditions described in the first and second embodiments and the conditions described in the first to third modifications are satisfied.
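The fourth modification amounts to combining several permission conditions with a logical OR. A minimal sketch, assuming the conditions are supplied as callables over a hypothetical landmark dictionary:

```python
from typing import Callable, Dict, Iterable, Tuple

Point3D = Tuple[float, float, float]
HandLandmarks = Dict[str, Point3D]
PermissionCondition = Callable[[HandLandmarks], bool]


def touch_permitted(landmarks: HandLandmarks,
                    conditions: Iterable[PermissionCondition]) -> bool:
    """Touch operation permission state if one or more configured conditions are satisfied."""
    return any(condition(landmarks) for condition in conditions)


if __name__ == "__main__":
    # Two toy conditions standing in for the checks sketched earlier
    # (thumb hidden in the palm, thumb touching the middle finger, ...).
    def thumb_in_palm(lm: HandLandmarks) -> bool:
        return sum((a - b) ** 2 for a, b in zip(lm["thumb_tip"], lm["palm_center"])) ** 0.5 < 0.04

    def thumb_on_middle(lm: HandLandmarks) -> bool:
        return sum((a - b) ** 2 for a, b in zip(lm["thumb_tip"], lm["middle_tip"])) ** 0.5 < 0.015

    hand = {"thumb_tip": (0.0, 0.0, 0.0), "palm_center": (0.1, 0.0, 0.0), "middle_tip": (0.01, 0.0, 0.0)}
    print(touch_permitted(hand, [thumb_in_palm, thumb_on_middle]))  # True (second condition holds)
```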
  • An information processing apparatus comprising a control unit that determines the type of input operation based on the state of the hand when the distance between an object and the operation site of the hand falls within a predetermined range.
  • The control unit determines that the input operation is a touch operation when it determines that the state of the hand is the touch operation permission state.
  • The control unit includes, in the operation site, one or more of the fingers and the palm included in the hand.
  • The control unit includes the index finger in the operation site.
  • The control unit sets one of the fingers of the hand as a determination target finger and determines whether the state of the hand is the touch operation permission state based on the state of the determination target finger.
  • The control unit determines that the state of the hand is the touch operation permission state when the index finger is included in the operation site and the fingertip of the thumb touches a finger other than the index finger.
  • The control unit determines that the state of the hand is the touch operation permission state when the index finger is included in the operation site and the fingertip of the thumb touches the index finger.
  • The information processing apparatus according to any one of the above.
  • The information processing apparatus according to any one of (5) to (9), wherein the control unit includes, in the determination target finger, at least one of the middle finger, the ring finger, and the little finger.
  • The control unit determines that the state of the hand is the touch operation permission state when the fingertip of the determination target finger is hidden by the palm.
  • The information processing apparatus according to (10) or (11), wherein the control unit includes the index finger in the operation site, includes the middle finger in the determination target finger, and determines that the state of the hand is the touch operation permission state when the middle finger touches the index finger.
  • The control unit determines the type of the input operation based on the state of the hand when the distance between a real object present in the real space and the operation site falls within a predetermined range.
  • The information processing apparatus according to any one of the items up to (12).
  • The information processing apparatus according to (13), wherein the control unit performs control to display an image to be the target of the input operation on the real object.
  • The control unit determines whether the state of the hand is the touch operation permission state when a virtual object is arranged in the real space and the distance between the virtual object and the operation site falls within a predetermined range.
  • The information processing apparatus according to any one of (1) to (15).
  • A program that causes a computer to realize a control function of determining the type of input operation based on the state of the hand when the distance between an object and the operation site of the hand falls within a predetermined range.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The problem addressed by the present invention is to provide a new and improved information processing device and the like that make it possible to more accurately determine an input operation performed by a user using a hand. The solution of the present invention is an information processing device comprising a control unit that, when the distance between an object and the operation site of a hand falls within a prescribed range, determines the type of input operation based on the state of the hand. In the present invention, the control unit and the like refer to both the distance between the object and the operation site of the hand and the state of the hand in order to determine the type of input operation.
PCT/JP2018/023548 2017-08-25 2018-06-21 Dispositif de traitement d'informations, procédé de traitement d'informations et programme WO2019039065A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-162074 2017-08-25
JP2017162074 2017-08-25

Publications (1)

Publication Number Publication Date
WO2019039065A1 true WO2019039065A1 (fr) 2019-02-28

Family

ID=65440016

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/023548 WO2019039065A1 (fr) 2017-08-25 2018-06-21 Dispositif de traitement d'informations, procédé de traitement d'informations et programme

Country Status (1)

Country Link
WO (1) WO2019039065A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004258714A (ja) * 2003-02-24 2004-09-16 Toshiba Corp 画像表示装置
JP2012068854A (ja) * 2010-09-22 2012-04-05 Shimane Prefecture 操作入力装置および操作判定方法並びにプログラム
US20150040040A1 (en) * 2013-08-05 2015-02-05 Alexandru Balan Two-hand interaction with natural user interface
US20150058782A1 (en) * 2013-08-21 2015-02-26 Intel Corporation System and method for creating and interacting with a surface display
JP2017073128A (ja) * 2015-10-08 2017-04-13 船井電機株式会社 空間入力装置

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021075103A1 (fr) * 2019-10-17 2021-04-22 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme
US12216830B2 (en) 2019-10-17 2025-02-04 Sony Group Corporation Information processing apparatus and information processing method

Similar Documents

Publication Publication Date Title
US20190250714A1 (en) Systems and methods for triggering actions based on touch-free gesture detection
JP2022118183A (ja) デジタルデバイスとの対話のための直接的なポインティング検出のためのシステムおよび方法
EP2817693B1 (fr) Dispositif de reconnaissance de gestes
EP2943836B1 (fr) Dispositif d'affichage facial réalisant un étalonnage du regard, et procédé de commande associé
US11360605B2 (en) Method and device for providing a touch-based user interface
CN105144072B (zh) 在多点触控装置上对压感进行模拟
TW201633066A (zh) 3d視覺化技術
US9916043B2 (en) Information processing apparatus for recognizing user operation based on an image
US10126854B2 (en) Providing touch position information
JP2017062709A (ja) ジェスチャー操作装置
US20160139762A1 (en) Aligning gaze and pointing directions
CN107179876B (zh) 基于虚拟现实系统的人机交互装置
Wolf et al. Tickle: a surface-independent interaction technique for grasp interfaces
US20210089161A1 (en) Virtual input devices for pressure sensitive surfaces
JP2015052819A (ja) 電子機器
WO2019039065A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
WO2017163566A1 (fr) Programme, dispositif informatique, procédé d'exécution de programme, et système
CN106575184B (zh) 信息处理装置、信息处理方法及计算机可读介质
KR102325684B1 (ko) 두부 착용형 시선 추적 입력 장치 및 이를 이용하는 입력 방법
JP6289655B2 (ja) 画面操作装置及び画面操作方法
CN106155284B (zh) 电子设备及信息处理方法
US9720513B2 (en) Apparatus and method for receiving a key input
KR101004671B1 (ko) 공간 투영 및 공간 터치 기능이 구비된 네트워크 단말 장치및 그 제어 방법
US11360632B2 (en) Information processing system
WO2015141529A1 (fr) Appareil d'entrée et programme de commande d'entrée

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18847415

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 18847415

Country of ref document: EP

Kind code of ref document: A1

点击 这是indexloc提供的php浏览器服务,不要输入任何密码和下载