WO2023032335A1 - Système d'assistance, procédé d'assistance, et programme (Support system, support method, and program) - Google Patents
- Publication number
- WO2023032335A1 (PCT/JP2022/016058)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- concentration
- subject
- degree
- state
- unit
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
Definitions
- the present invention relates to a support system, support method, and program.
- Patent Document 1 discloses a lighting device in which a main light source that mainly emits white light and a single-wavelength light source are turned on, and the light output of the single-wavelength light source is controlled so that a desired arousal level is obtained; the device can thereby improve the arousal level.
- The present invention provides a support system, a support method, and a program that can help a user grasp information about a target person, including the target person's degree of concentration.
- A support system according to one aspect of the present invention includes: an acquisition unit that acquires a feature amount related to the biological activity of a subject who is performing intellectual work; a calculation unit that calculates, based on the feature amount acquired by the acquisition unit, the subject's degree of concentration on the intellectual work; and a presentation unit that presents information about the subject, including the degree of concentration calculated by the calculation unit, to the user.
- A support method according to one aspect of the present invention includes: an acquisition step of acquiring a feature amount related to the biological activity of a subject who is performing intellectual work; a calculation step of calculating, based on the feature amount acquired in the acquisition step, the subject's degree of concentration on the intellectual work; and a presentation step of presenting information about the subject, including the degree of concentration calculated in the calculation step, to the user.
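As a rough illustration of the three steps above (acquisition, calculation, presentation), the following minimal sketch shows how they might fit together. The function names, the feature format, and the movement-based placeholder rule are all illustrative assumptions, not taken from this publication.

```python
# Minimal sketch of the support method's three steps. Everything here is
# a stand-in: real feature amounts would come from image processing, and
# the real degree of concentration from the calculation unit's model.

def acquire_features(frame):
    """Acquisition step: pull feature amounts related to the subject's
    biological activity out of one sensing sample (stubbed here)."""
    return {"head_movement": frame.get("head_movement", 0.0),
            "eye_movement": frame.get("eye_movement", 0.0)}

def calculate_concentration(features):
    """Calculation step: map feature amounts to a degree of concentration
    in [0.0, 1.0]. Assumption: less movement -> higher concentration."""
    movement = (features["head_movement"] + features["eye_movement"]) / 2
    return max(0.0, min(1.0, 1.0 - movement))

def present(subject_id, degree):
    """Presentation step: return the text shown to the user U2."""
    return f"subject {subject_id}: concentration {degree:.2f}"

frame = {"head_movement": 0.2, "eye_movement": 0.4}
degree = calculate_concentration(acquire_features(frame))
print(present("U1", degree))  # subject U1: concentration 0.70
```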
- one aspect of the present invention can be implemented as a program for causing a computer to execute the above support method.
- One aspect of the present invention can also be realized as a computer-readable recording medium storing the program.
- The support system, support method, and program of the present invention can help the user grasp information about the target person, including the target person's degree of concentration.
- FIG. 1 is a diagram for explaining an overview of a support system according to an embodiment.
- FIG. 2 is a block diagram illustrating an example of the functional configuration of the support system according to the embodiment;
- FIG. 3 is a flow chart showing a first example of the operation of the support system according to the embodiment.
- FIG. 4 is a diagram showing an example of determination results of a subject's degree of concentration and state of concentration.
- FIG. 5 is a flow chart showing a second example of the operation of the support system according to the embodiment.
- FIG. 6 is a flow chart showing a third example of the operation of the support system according to the embodiment.
- FIG. 7 is a flow chart showing Modification 1 of the third example of the operation of the support system according to the embodiment.
- FIG. 8 is a flow chart showing Modification 2 of the third example of the operation of the support system according to the embodiment.
- FIG. 9 is a diagram showing an example of advice proposed to the subject.
- FIG. 10 is a diagram showing an example of information about the target person presented to the user.
- 11 is a block diagram illustrating an example of a functional configuration of a support system according to a modification of the embodiment;
- FIG. 12 is a diagram showing an example of the feature amount during concentration and the feature amount during non-concentration stored in the feature amount database.
- FIG. 13 is a diagram showing another example of non-concentration feature amounts stored in the feature amount database.
- Each figure is a schematic diagram and is not necessarily strictly illustrated. Moreover, in each figure, the same reference signs are given to substantially the same components, and duplicate descriptions may be omitted.
- FIG. 1 is a diagram for explaining an overview of a support system 10 according to an embodiment.
- The support system 10 calculates the degree of concentration on intellectual work of the subject U1, based on a feature amount related to the biological activity of the subject U1 who is performing intellectual work in the work space 4, and presents information about the subject U1, including the calculated degree of concentration, to the user U2. Thereby, the support system 10 assists the user U2 in grasping information about the subject U1 who is performing intellectual work, such as the degree of concentration of the subject U1. For example, as shown in FIG. 1, the degree of concentration of the subject U1 may be presented to the user U2 by being displayed on the second terminal device 40 used by the user U2.
- The information about the subject U1 including the degree of concentration of the subject U1 is, for example, the degree of concentration of the subject U1, the concentration state of the subject U1, support information for assisting the subject U1's performance of intellectual work, and the like. A specific description of this information will be given later.
- Intellectual work is work performed using the intelligence (in other words, the brain) of the subject U1, and includes, for example, cognition (in other words, recognition and understanding), thinking, judgment, logic, calculation, and creation.
- intellectual work may include work, study, or the like.
- "Concentration" means working on something while allocating sufficient cognitive resources in the mind to the target of the intellectual work.
- A "concentration state" is a state in which cognitive resources are allocated to the target of the intellectual work, and the "degree of concentration" expresses its degree.
- That is, the "degree of concentration" means the degree to which cognitive resources are concentrated on the target of the intellectual work while working on something.
- The degree of concentration affects the amount of intellectual work performed per unit time. That is, the higher the degree of concentration, the greater the amount of intellectual work per unit time, and the lower the degree of concentration, the smaller the amount of intellectual work per unit time.
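The stated monotonic relationship between the degree of concentration and work output per unit time can be sketched by accumulating a hypothetical linear model over a concentration time series. The base rate and the linear form are illustrative assumptions, not taken from this publication.

```python
# Sketch of the relationship above: the amount of intellectual work done in
# an interval grows with the degree of concentration. The linear model
# (work per minute = BASE_RATE * degree) is an invented illustration.

BASE_RATE = 10.0  # hypothetical work units per minute at full concentration

def work_amount(concentration_series, minutes_per_sample=1.0):
    """Accumulate estimated work over a time series of concentration degrees."""
    return sum(BASE_RATE * c * minutes_per_sample for c in concentration_series)

focused = [0.9, 0.8, 0.9, 0.85]      # consistently high concentration
distracted = [0.3, 0.4, 0.2, 0.35]   # consistently low concentration

# Higher concentration over the same interval -> more work, as stated.
assert work_amount(focused) > work_amount(distracted)
print(work_amount(focused), work_amount(distracted))
```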
- The feature amounts related to the biological activity of the subject U1 who is performing intellectual work include, for example, information indicating the state of at least one of the posture, hand movement, head movement, eye movement, facial expression, and handwriting of the subject U1 who is performing the intellectual work. The feature amount may be included in data acquired by a non-contact device, which will be described later, or may be included in data acquired by a contact device.
- the target person U1 is a student or worker, and the user U2 is a teacher of the student or a manager of the worker.
- The support system 10 may calculate the degrees of concentration of a plurality of target people U1, and present information about each target person U1, including his or her degree of concentration, to the user U2.
- The work space 4 is a space in which the subject U1 performs intellectual work (for example, a space in an office, a home, a cram school, or a school).
- a plurality of subjects U1 may be located in one work space 4 , or each of the plurality of subjects U1 may be located in a different work space 4 .
- The work desk is provided with, for example, non-contact devices such as a camera 1 and a human sensor 2, and a first terminal device 20 such as a personal computer or a tablet terminal.
- the camera 1 may be provided in the first terminal device 20 .
- the support system 10 may include, for example, a server device (for example, the information processing device 30 in FIG. 2).
- The server device (for example, the information processing device 30) may be installed in the work space 4 or in a building that includes the work space 4 (for example, an office building, a home, a cram school, or a school), or may be a cloud server installed outside the building.
- the support system 10 may be realized by the second terminal device 40 used by the user U2.
- FIG. 2 is a block diagram showing an example of the functional configuration of the support system 10 according to the embodiment.
- the support system 10 includes, for example, two or more first terminal devices 20, an information processing device 30, and a second terminal device 40.
- Each of the two or more first terminal devices 20 is communicably connected to the camera 1 and the human sensor 2, for example.
- the camera 1 captures an image of the subject U1.
- the camera 1 is installed at a position capable of photographing at least the head of the subject U1 substantially from the front.
- the camera 1 may be, for example, installed on a work desk, or may be a camera mounted on the first terminal device 20 used by the subject U1.
- the camera 1 may photograph the subject U1 using visible light, or may photograph the subject U1 using infrared light.
- the imaging range of the camera 1 is not limited to the head of the subject U1, and may include the upper body of the subject U1. Images captured by the camera 1 are transmitted to the first terminal device 20 .
- Communication between the camera 1 and the first terminal device 20 may be wired communication or wireless communication, and the communication standard is not particularly limited.
- the human sensor 2 detects the presence or absence of the target person U1.
- a detection result detected by the human sensor 2 is transmitted to the first terminal device 20 .
- Communication between the human sensor 2 and the first terminal device 20 may be wired communication or wireless communication, and the communication standard is not particularly limited.
- In the above description, the human sensor 2 was exemplified as a sensor for detecting the presence or absence of the target person U1, but the configuration is not limited to this. For example, a hardware button may be used, or a button displayed on a touch panel display provided in the first terminal device 20 may be used. Alternatively, a seating sensor that detects that the subject U1 is sitting on a chair may be used. Note that a configuration for detecting the presence or absence of the target person U1 does not necessarily have to be provided.
- The first terminal device 20 acquires the image transmitted from the camera 1 and transmits the acquired image to the information processing device 30. At this time, the first terminal device 20 may transmit the acquired image to the information processing device 30 based on the detection result transmitted from the human sensor 2, that is, when the target person U1 is present. Alternatively, the first terminal device 20 may acquire the image transmitted from the camera 1 and the information on whether or not the target person U1 is present (the so-called detection result) transmitted from the human sensor 2, and transmit the acquired image and the detection result to the information processing device 30.
- the first terminal device 20 may include a reception unit (not shown) that receives an operation input from the subject U1.
- For example, the subject U1 may input his or her own presence or absence using buttons (for example, a presence button and a leave button) displayed on the first terminal device 20.
- The first terminal device 20 may present to the subject U1 advice, output from the information processing device 30, on at least one of the content of the subject U1's intellectual work, the tools used for performing the intellectual work, and the method for performing the intellectual work.
- the content of the intellectual work may be, for example, the subjects and units that the subject U1 is studying.
- a tool for performing intellectual work may be, for example, a learning material, a learning video, or an application.
- the method for performing intellectual work may be, for example, time allocation for each subject, time allocation for each teaching material, or the order of learning them.
- The information processing device 30 acquires an image of the subject U1 captured by the camera 1, calculates the degree of concentration of the subject U1 based on the feature amount included in the acquired image, and outputs the calculated degree of concentration to the second terminal device 40 of the user U2, thereby presenting the information about the target person U1, including the degree of concentration of the target person U1, to the user U2.
- the information processing device 30 includes a communication section 31 , a control section 32 and a storage section 33 .
- the communication unit 31 is a communication circuit (communication module) for the information processing device 30 to communicate with the first terminal device 20 and the second terminal device 40 .
- the communication unit 31 may include a communication circuit for communicating via the wide area communication network and a communication circuit for communicating via the local communication network.
- the communication unit 31 is, for example, a wireless communication circuit that performs wireless communication, but may be a wired communication circuit that performs wired communication. Note that the communication standard for communication performed by the communication unit 31 is not particularly limited.
- control unit 32 performs various information processing based on the image acquired from the first terminal device 20.
- the control unit 32 includes an acquisition unit 32a, a calculation unit 32b, a determination unit 32c, an evaluation unit 32d, and an output unit 32e.
- The functions of the acquisition unit 32a, the calculation unit 32b, the determination unit 32c, the evaluation unit 32d, and the output unit 32e are realized by the processor or microcomputer constituting the control unit 32 executing a computer program stored in the storage unit 33. Details of the functions of these units will be described later in the operation examples.
- the storage unit 33 is a storage device that stores a dedicated application program and the like for the control unit 32 to execute.
- a machine learning model 34 and a database 35 may be stored in the storage unit 33 .
- the machine learning model 34 is used for concentration calculation processing. For example, when an image of the subject U1 is input, the machine learning model 34 outputs the degree of concentration of the subject U1 from the feature amount included in the image.
- The machine learning model 34 is, for example, a neural network (NN). More specifically, the machine learning model 34 may have convolutional layers, such as a convolutional neural network (CNN). Note that the example of the machine learning model 34 described above is merely an example, and the model is not limited to this.
- The machine learning model 34 may be trained using teacher data, for example.
- The teacher data may be, for example, a data set including pairs of an image of a target person performing intellectual work as input data and the target person's degree of concentration on the intellectual work as output data.
- the machine learning model 34 is a trained machine learning model and includes learned parameters adjusted by machine learning.
- the machine learning model 34 may be generated by a model generation unit (not shown), or may be generated by an external device such as a cloud server.
- the model generation unit is realized by executing a program stored in the storage unit 33 by the processor.
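The publication describes the model only at the level above: a neural network such as a CNN, trained on teacher data pairing subject images with concentration degrees. As a library-free stand-in for that pipeline, the following sketch trains a tiny linear model by gradient descent on already-extracted feature vectors. The feature values, labels, learning rate, and epoch count are all invented for illustration; the real model operates on images.

```python
# Stand-in for the machine learning model 34: "learned parameters adjusted
# by machine learning" demonstrated with a linear model and plain SGD.
# Teacher data: (feature vector, degree of concentration) pairs, made up here.
teacher_data = [
    ([0.1, 0.9], 0.8),  # little movement, eyes on task -> high concentration
    ([0.8, 0.2], 0.3),  # much movement, eyes wandering -> low concentration
    ([0.2, 0.8], 0.7),
    ([0.9, 0.1], 0.2),
]

weights = [0.0, 0.0]
bias = 0.0
lr = 0.1  # hypothetical learning rate

for _ in range(2000):  # adjust the "learned parameters" by gradient descent
    for x, y in teacher_data:
        pred = sum(w * xi for w, xi in zip(weights, x)) + bias
        err = pred - y
        weights = [w - lr * err * xi for w, xi in zip(weights, x)]
        bias -= lr * err

def predict(features):
    """Inference: feature amounts in, estimated degree of concentration out."""
    raw = sum(w * xi for w, xi in zip(weights, features)) + bias
    return max(0.0, min(1.0, raw))

# An input resembling the high-concentration examples scores higher than
# one resembling the low-concentration examples.
print(predict([0.15, 0.85]), predict([0.85, 0.15]))
```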
- the database 35 stores the identification information of the subject U1 and the past degree of concentration of the subject U1 (in other words, the history of the degree of concentration of the subject U1) in association with each other.
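A minimal sketch of what the database 35 might look like, assuming a relational store that associates the subject's identification information with a history of past concentration degrees. The table name, column names, and use of SQLite are hypothetical choices, not stated in the publication.

```python
import sqlite3
import time

# Hypothetical schema for database 35: one row per concentration measurement,
# keyed by the subject's identification information.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE concentration_history (
                    subject_id  TEXT,
                    measured_at REAL,
                    degree      REAL)""")

def record(subject_id, degree, measured_at=None):
    """Store one measured degree of concentration for a subject."""
    conn.execute("INSERT INTO concentration_history VALUES (?, ?, ?)",
                 (subject_id, measured_at or time.time(), degree))

def history(subject_id):
    """Return the subject's past degrees of concentration, oldest first."""
    rows = conn.execute(
        "SELECT degree FROM concentration_history "
        "WHERE subject_id = ? ORDER BY measured_at", (subject_id,))
    return [r[0] for r in rows]

record("U1", 0.7, measured_at=1.0)
record("U1", 0.5, measured_at=2.0)
print(history("U1"))  # [0.7, 0.5]
```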
- the second terminal device 40 is an information terminal such as a personal computer, a tablet terminal, or a smart phone used by the user U2.
- the second terminal device 40 includes, for example, a communication unit 41 , a control unit 42 , a storage unit 43 , a reception unit 44 and a presentation unit 45 .
- the communication unit 41 is a communication circuit (communication module) for the second terminal device 40 to communicate with the information processing device 30 .
- the communication unit 41 may include a communication circuit for communicating via the wide area communication network and a communication circuit for communicating via the local communication network.
- the communication unit 41 is, for example, a wireless communication circuit that performs wireless communication, but may be a wired communication circuit that performs wired communication. Note that the communication standard for communication performed by the communication unit 41 is not particularly limited.
- the control unit 42 performs various information processing related to the second terminal device 40 based on the input operation accepted by the accepting unit 44 .
- the control unit 42 is implemented by, for example, a microcomputer, but may be implemented by a processor.
- the storage unit 43 is a storage device that stores control programs and the like executed by the control unit 42 .
- the storage unit 43 is implemented by, for example, a semiconductor memory.
- the reception unit 44 receives the input operation of the user U2.
- the reception unit 44 is implemented by, for example, a touch panel, but may be implemented by a mouse, keyboard, hardware buttons, microphone, or the like.
- the presentation unit 45 presents information about the target person U1 including the degree of concentration of the target person U1 output by the information processing device 30 .
- the presentation unit 45 is, for example, a display device that displays image information including characters and symbols.
- the presentation unit 45 may include an audio output device that outputs audio information.
- the display device is, for example, a display including a liquid crystal (LC) panel or an organic EL (Electro Luminescence) panel as a display device.
- the audio output device is, for example, a speaker.
- The presentation unit 45 may display the information about the degree of concentration of the target person U1 as image information on the display device, may output it as audio information from the audio output device, or may present both image information and audio information.
- FIG. 3 is a flow chart showing a first example of the operation of the support system 10 according to the embodiment.
- For example, when the subject U1 touches a button displayed on the touch panel display of the first terminal device 20 when starting intellectual work, the first terminal device 20 instructs the camera 1 and the human sensor 2 to start sensing (not shown). Upon receiving the instruction signal, the camera 1 and the human sensor 2 start sensing, and transmit sensing data (here, image data and data indicating the presence or absence of a person) to the first terminal device 20 (not shown).
- The first terminal device 20 transmits an image of the target person U1 to the information processing device 30 based on the sensing data acquired from the camera 1 and the human sensor 2 (not shown). Specifically, the first terminal device 20 transmits the image captured by the camera 1 while the human sensor 2 detects the presence of a person (here, the target person U1) to the information processing device 30 as an image of the target person U1 (more specifically, an image showing the head of the subject U1).
- the acquisition unit 32 a of the information processing device 30 periodically acquires the image of the target person U1 transmitted from the first terminal device 20 via the communication unit 31 .
- the image of the subject U1 includes at least the head of the subject U1 photographed substantially from the front.
- The image of the head of the subject U1 contains information such as the movement of the head of the subject U1, the state of the eyes of the subject U1 (the inner corner of the eye, the iris, the corneal reflex, or the pupil), or the facial expression of the subject U1. These pieces of information are so-called feature amounts related to the biological activity of the subject U1.
- the acquiring unit 32a acquires the feature amount related to the biological activity of the subject U1 who is performing intellectual work (S11).
- More specifically, the acquisition unit 32a acquires an image (feature amount) captured when the subject U1 is present, and information indicating whether or not the subject U1 is present.
- the calculation unit 32b calculates the degree of concentration of the subject U1 on intellectual work based on the image of the subject U1 acquired by the acquisition unit 32a (S12).
- The image of the subject U1 includes feature amounts related to the biological activity of the subject U1. That is, the calculation unit 32b calculates the degree of concentration on the intellectual work of the subject U1 based on the feature amount acquired by the acquisition unit 32a.
- The calculation unit 32b may calculate the degree of concentration each time the acquisition unit 32a acquires the feature amount, or may calculate the degree of concentration each time the acquisition unit 32a acquires a change (difference) in the feature amount.
- For example, the calculation unit 32b extracts one or more feature amounts by appropriately performing image processing on the image of the subject U1 acquired by the acquisition unit 32a. Then, the calculation unit 32b calculates a score related to the degree of concentration for each feature amount by comparing each extracted feature amount with a corresponding template image; the score may be calculated according to, for example, the degree of matching with the template image. Then, the calculation unit 32b adds up the scores calculated for the respective feature amounts, and calculates the degree of concentration based on the total. The calculation unit 32b may weight each feature amount when adding up the scores.
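The weighted scoring just described might be sketched as follows. The feature names and weights are illustrative assumptions, and the per-feature template-matching scores are stubbed as plain values in [0, 1] rather than computed from images.

```python
# Sketch of the score-based calculation: per-feature scores (stand-ins for
# template-matching results) are combined with per-feature weights, and the
# weighted sum is normalized into a degree of concentration in [0.0, 1.0].
# Weights and feature names are invented for illustration.

WEIGHTS = {"posture": 0.2, "eye_movement": 0.3,
           "head_movement": 0.2, "facial_expression": 0.3}

def degree_of_concentration(scores):
    """Weighted average of per-feature scores over the features present."""
    total_weight = sum(WEIGHTS[name] for name in scores)
    weighted = sum(WEIGHTS[name] * s for name, s in scores.items())
    return weighted / total_weight if total_weight else 0.0

scores = {"posture": 0.9, "eye_movement": 0.8,
          "head_movement": 0.7, "facial_expression": 0.6}
print(round(degree_of_concentration(scores), 2))  # 0.74
```

Normalizing by the total weight of the features actually present lets the same function work when only a subset of feature amounts could be extracted from a given image.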
- Furthermore, the calculation unit 32b may extract, based on the information indicating whether or not the target person U1 is present, the image captured when the target person U1 is present, and calculate the degree of concentration using the extracted image (in other words, the feature amount included in the image).
- the output unit 32e outputs information about the target person U1 including the degree of concentration of the target person U1 calculated by the calculation unit 32b in step S12 (not shown).
- Information about the subject U1 including the degree of concentration output by the output unit 32 e is transmitted to the second terminal device 40 via the communication unit 31 .
- The control unit 42 of the second terminal device 40 acquires, via the communication unit 41, the information about the target person U1 including the degree of concentration transmitted from the information processing device 30, and causes the presentation unit 45 to present the acquired information, thereby presenting the information about the target person U1, including the degree of concentration of the target person U1, to the user (S13).
- the degree of concentration (more specifically, time-series data of the degree of concentration) presented by the presentation unit 45 may be, for example, a graph as shown in FIG.
- the presentation unit 45 may present the concentration state of the subject U1 together with the degree of concentration (specifically, time-series data of the degree of concentration).
- The information about the subject U1 includes, for example, the degree of concentration, the concentration state, the persistence of the concentration state, and support information for supporting the subject U1's performance of intellectual work (for example, advice on improving the efficiency of the intellectual work).
- The support information is, for example, advice for each of the subject U1 and the user U2, and includes advice on how the user U2 can support the subject U1.
- These pieces of advice may be, for example, advice on at least one of the content of the subject U1's intellectual work, the tools used for performing the intellectual work, and the method for performing the intellectual work, depending on the concentration state of the subject U1.
- FIG. 4 is a diagram showing an example of determination results of the degree of concentration and concentration state of the subject U1.
- In FIG. 4, the degree of concentration is expressed as a ratio whose maximum value is 1.0 and whose minimum value is 0.0, but it may instead be expressed as a percentage whose maximum value is 100% and whose minimum value is 0%, or as a rank such as level 1 or level 2.
- In the above example, the degree of concentration of one target person U1 is calculated, and the degree of concentration of that one target person U1 (for example, a graph of time-series data of the degree of concentration) is presented to the user U2 as information about the target person U1; however, the degree of concentration may be similarly calculated for each of a plurality of subjects U1 and presented to the user U2.
- the acquisition unit 32a of the information processing device 30 may acquire images (more specifically, feature amounts included in the images) of the plurality of subjects U1 from the plurality of first terminal devices 20.
- the calculation unit 32b may calculate the degree of concentration of each of the plurality of subjects based on the plurality of feature amounts.
- the presentation unit 45 may present information regarding each of the plurality of subjects U1, including the degree of concentration of each of the plurality of subjects U1, to the user U2.
- FIG. 5 is a flow chart showing a second example of the operation of the support system 10 according to the embodiment.
- steps that are the same as those shown in FIG. 3 are given the same step numbers.
- the presentation unit 45 presented the user U2 with the degree of concentration of the target person U1 calculated by the calculation unit 32b as information about the target person U1.
- the second example differs from the first example in that the concentration state of the subject U1 is determined based on the change in the degree of concentration calculated by the calculation unit 32b, and the determined concentration state is presented to the user U2.
- the description of the same operations as in the first example will be omitted or simplified, and the differences from the first example will be mainly described.
- the acquisition unit 32a of the information processing device 30 acquires the image of the subject U1.
- the acquiring unit 32a acquires the feature amount related to the biological activity of the subject U1 who is performing intellectual work (S11).
- the calculation unit 32b calculates the degree of concentration of the subject U1 on intellectual work based on the image of the subject U1 acquired by the acquisition unit 32a (S12).
- the determination unit 32c determines the concentration state of the subject U1 based on the change in the degree of concentration calculated by the calculation unit 32b in step S12 (S21). Specifically, the determination unit 32c determines the concentration state of the subject U1 based on the threshold. For example, as shown in FIG. 4, the first threshold (Th1) of the degree of concentration is set to 0.5, and the second threshold (Th2) is set to 0.6.
- If the degree of concentration is equal to or greater than the second threshold, the determination unit 32c determines that the concentration state of the subject U1 is good (for example, presented as ◎ or "good"); if the degree of concentration exceeds the first threshold and is less than the second threshold, the concentration state of the subject U1 is determined to be normal (for example, presented as ○ or "OK"); and if the degree of concentration is less than the first threshold, the concentration state of the subject U1 is determined to be bad (for example, presented as × or "not acceptable"). Since the meaning of the concentration state has been described above, the explanation is omitted here.
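- The two-threshold classification in step S21 can be sketched as follows. This is an illustrative Python sketch, not code from the publication; the threshold values follow the FIG. 4 example (Th1 = 0.5, Th2 = 0.6), and the handling of values exactly equal to a threshold is an assumption.

```python
# Illustrative sketch of the two-threshold classification (step S21).
# TH1/TH2 follow the example values in FIG. 4; the labels mirror the
# good / normal / bad states described in the text.

TH1 = 0.5  # first threshold
TH2 = 0.6  # second threshold

def classify_concentration_state(degree: float) -> str:
    """Map a degree of concentration (0.0-1.0) to a concentration state."""
    if degree >= TH2:
        return "good"    # e.g. presented as a double-circle mark
    if degree > TH1:
        return "normal"  # e.g. presented as "OK"
    return "bad"         # e.g. presented as "not acceptable"
```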
- the presentation unit 45 presents the concentration state determined by the determination unit 32c to the user (S22). For example, as shown in FIG. 4, the presentation unit 45 may superimpose a graph of the degree of concentration on the state of concentration and display them on the display. At this time, the presentation unit 45 may output audio information indicating the determination result of the concentration state.
- In the second example, the concentration state of the subject U1 is determined based on the change in concentration level, and the determined concentration state of the subject U1 is presented to the user U2. Modification 1 differs from the second example in that, when it is determined that the concentration state of the subject U1 has deteriorated, the fact that the concentration state has deteriorated is presented to the user.
- The determination unit 32c may determine the concentration state of the subject U1 based on whether or not the degree of concentration has decreased by more than a predetermined rate. Specifically, for example, the determination unit 32c may determine that the concentration state of the subject U1 has deteriorated when the degree of concentration has decreased by more than the predetermined rate. At this time, in step S13, the presentation unit 45 of the second terminal device 40 presents to the user U2 that the concentration state of the target person U1 has deteriorated.
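- The rate-of-decrease criterion of Modification 1 can be sketched as follows. The text does not fix the predetermined rate or how the decrease is measured, so the 20% rate and the comparison of the first and last samples of a history window are illustrative assumptions.

```python
# Illustrative sketch of Modification 1 of the second example:
# flag deterioration when the degree of concentration has dropped
# by more than a predetermined (fractional) rate over a window.

def has_deteriorated(history: list[float], rate: float = 0.2) -> bool:
    """True when concentration fell from the start to the end of
    `history` by more than `rate` (e.g. 0.2 = a 20% drop)."""
    if len(history) < 2 or history[0] == 0:
        return False
    drop = (history[0] - history[-1]) / history[0]
    return drop > rate
```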
- Modification 2 of the second example differs from Modification 1 of the second example in the determination criteria for determining that the concentration state of the subject U1 has deteriorated.
- The determination unit 32c may determine the concentration state of the subject U1 based on whether or not the degree of concentration has remained below a threshold value (for example, the first threshold value in FIG. 4) for a certain period of time or longer (for example, 5 minutes or longer). Specifically, for example, the determination unit 32c may determine that the concentration state of the subject U1 has deteriorated when the degree of concentration has remained below the threshold value (for example, the first threshold value) for the certain period of time or longer. At this time, in step S13, the presentation unit 45 of the second terminal device 40 presents to the user U2 that the concentration state of the target person U1 has deteriorated.
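- Modification 2's duration criterion can be sketched as follows. Samples are assumed to arrive at a fixed interval; the 10-second sampling period is an illustrative assumption, while the 5-minute duration and the first threshold follow the text's example.

```python
# Illustrative sketch of Modification 2 of the second example:
# deterioration is flagged when the degree of concentration stays
# below the first threshold for a certain period (5 min in the text).

SAMPLE_PERIOD_S = 10   # assumed sampling interval (not in the text)
DURATION_S = 5 * 60    # "certain period of time": 5 minutes
TH1 = 0.5              # first threshold (FIG. 4)

def below_threshold_too_long(history: list[float]) -> bool:
    """True if the most recent samples are all below TH1 for >= 5 min."""
    needed = DURATION_S // SAMPLE_PERIOD_S
    if len(history) < needed:
        return False
    return all(v < TH1 for v in history[-needed:])
```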
- The determination unit 32c may observe the degree of concentration and the state of concentration of the subject U1 over a medium- to long-term period of, for example, one week to one month, and determine the characteristics (in other words, patterns) of the degree of concentration and the state of concentration unique to the subject U1. Further, the determination unit 32c may set the threshold and the above-mentioned certain period of time according to these characteristics. Accordingly, the determination unit 32c can accurately determine that the concentration state of the subject U1 has deteriorated and present it to the user U2. As a result, the user U2 can grasp, based on the medium- to long-term data, an irregularity in the degree of concentration of the target person U1 and a sign of such an irregularity.
- Modification 3 of the second example differs from Modifications 1 and 2 in the determination criteria for determining that the concentration state of the subject U1 has deteriorated.
- The determination unit 32c may determine the concentration state of the subject U1 based on whether the number of times the degree of concentration falls below a threshold value (for example, the first threshold value in FIG. 4) within a certain period of time (for example, 30 minutes) exceeds a predetermined number of times (for example, twice). Specifically, for example, the determination unit 32c may determine that the concentration state of the subject U1 has deteriorated when the number of times the degree of concentration falls below the threshold value (for example, the first threshold value) within the certain period of time (for example, 30 minutes) exceeds the predetermined number of times (for example, twice). At this time, in step S13, the presentation unit 45 of the second terminal device 40 presents to the user U2 that the concentration state of the target person U1 has deteriorated.
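- Modification 3's counting criterion can be sketched as follows, counting downward crossings of the first threshold within a window of samples. The window length (30 minutes) and the limit (twice) follow the text's example; treating a "fall below" as a downward crossing of the threshold is an assumption.

```python
# Illustrative sketch of Modification 3 of the second example:
# count how many times the degree of concentration crosses below
# the first threshold within a window, and flag deterioration when
# that count exceeds a predetermined number of times.

TH1 = 0.5  # first threshold (FIG. 4)

def count_dips(history: list[float], threshold: float = TH1) -> int:
    """Count downward crossings of `threshold` in `history`."""
    dips = 0
    for prev, cur in zip(history, history[1:]):
        if prev >= threshold and cur < threshold:
            dips += 1
    return dips

def deteriorated(history: list[float], max_dips: int = 2) -> bool:
    """True when the dip count exceeds the predetermined number."""
    return count_dips(history) > max_dips
```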
- The determination unit 32c may observe the degree of concentration and the state of concentration of the subject U1 over a medium- to long-term period of, for example, one week to one month, and determine the characteristics (in other words, patterns) of the degree of concentration and the state of concentration unique to the subject U1. Furthermore, the determination unit 32c may set the threshold and the above-mentioned predetermined number of times according to these characteristics. Accordingly, the determination unit 32c can accurately determine that the concentration state of the subject U1 has deteriorated and present it to the user U2. As a result, the user U2 can grasp, based on the medium- to long-term data, an irregularity in the degree of concentration of the target person U1 and a sign of such an irregularity.
- [Modification 4 of the second example] In Modification 4 of the second example, unlike Modifications 1 to 3 described above, criteria for determining whether or not the target person U1 is in a state of being able to maintain concentration will be described.
- The determination unit 32c may determine whether or not the subject U1 is in a state of being able to maintain concentration, based on the time interval at which the degree of concentration falls below a threshold (for example, the first threshold in FIG. 4). Specifically, for example, if the time interval at which the degree of concentration falls below the threshold (for example, the first threshold) is shorter than a predetermined interval (for example, 10 minutes), the determination unit 32c may determine that the subject U1 is not in a state of being able to maintain concentration. At this time, in step S13, the presentation unit 45 of the second terminal device 40 presents to the user U2 that the target person U1 is not in a state of being able to maintain a concentrated state.
- That the target person U1 is not in a state of being able to maintain a concentrated state means that the target person U1 is currently concentrating on the intellectual work but is unlikely to maintain that state of concentration and is highly likely to fall into a state of non-concentration.
- the determination result is sent to the user U2.
- the presentation enables the user U2 to support the target person U1 before the target person U1's concentration state deteriorates.
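- Modification 4's interval criterion can be sketched as follows, using timestamped samples and measuring the gap between successive moments at which the degree of concentration first drops below the first threshold. The 10-minute interval and the first threshold follow the text's example; the timestamped-sample representation is an assumption.

```python
# Illustrative sketch of Modification 4 of the second example:
# if two successive dips below the first threshold occur less than
# a predetermined interval apart (10 min in the text), the subject
# is judged unable to maintain a state of concentration.

TH1 = 0.5               # first threshold (FIG. 4)
MIN_INTERVAL_S = 10 * 60  # predetermined interval: 10 minutes

def cannot_maintain(samples: list[tuple[float, float]]) -> bool:
    """samples: (timestamp_s, degree) pairs in time order."""
    dip_times = []
    prev_degree = None
    for t, degree in samples:
        # Record the moment of each downward crossing of TH1.
        if degree < TH1 and (prev_degree is None or prev_degree >= TH1):
            dip_times.append(t)
        prev_degree = degree
    return any(b - a < MIN_INTERVAL_S
               for a, b in zip(dip_times, dip_times[1:]))
```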
- The determination unit 32c may observe the degree of concentration and the state of concentration of the subject U1 over a medium- to long-term period of, for example, one week to one month, and determine the characteristics (in other words, patterns) of the degree of concentration and the state of concentration unique to the subject U1. Further, the determination unit 32c may set the threshold for the time interval at which the degree of concentration of the subject U1 falls below the threshold (for example, the first threshold), that is, the predetermined interval, according to the characteristics of the degree of concentration and the state of concentration unique to the subject U1. Accordingly, the determination unit 32c can accurately determine that the concentration state of the subject U1 has deteriorated and present it to the user U2. As a result, the user U2 can grasp, based on the medium- to long-term data, an irregularity in the degree of concentration of the target person U1 and a sign of such an irregularity.
- FIG. 6 is a flow chart showing a third example of the operation of the support system 10 according to the embodiment.
- the concentration state of the target person U1 is determined based on the change in the degree of concentration calculated by the calculation unit 32b, and the determined concentration state is presented to the user U2.
- the third example differs from the second example in that the relative concentration state is evaluated by comparing the degree of concentration calculated by the calculation unit 32b for each of the plurality of subjects U1.
- the description of the same operations as in the second example will be omitted or simplified, and the differences from the second example will be mainly described.
- the acquisition unit 32a of the information processing device 30 acquires the images of the plurality of subjects U1 transmitted from the plurality of first terminal devices 20. In other words, the acquisition unit 32a acquires the feature amount related to life activity for each of the plurality of subjects U1 who are performing intellectual work (S31).
- the calculation unit 32b calculates the degree of concentration of each of the plurality of subjects U1 based on the feature amounts of the plurality of subjects U1 acquired by the acquisition unit 32a in step S31 (S32).
- The evaluation unit 32d evaluates the relative concentration state of each of the plurality of subjects U1 by comparing the degrees of concentration of the plurality of subjects U1 calculated by the calculation unit 32b in step S32 (S33).
- the output unit 32e outputs the relative concentration state (hereinafter also referred to as the evaluation result of S33) of each of the plurality of subjects U1 evaluated by the evaluation unit 32d in step S33 (not shown).
- The evaluation result output by the output unit 32e is transmitted to the second terminal device 40 via the communication unit 31.
- When the control unit 42 of the second terminal device 40 acquires the evaluation result transmitted from the information processing device 30 via the communication unit 41, it causes the presentation unit 45 to present the acquired evaluation result of S33. That is, the presentation unit 45 presents the evaluation result of S33 (more specifically, the relative concentration state of each of the plurality of subjects U1) to the user (S34).
- [Modification 1 of the third example] In the third example, by comparing the degrees of concentration of the plurality of subjects U1, the relative concentration state of each of the plurality of subjects U1 is evaluated and presented to the user U2. In Modification 1, furthermore, the rates of decrease in the degree of concentration of the plurality of subjects U1 are compared, and if there is a degree of concentration whose rate of decrease exceeds a reference value, the subject corresponding to that degree of concentration is presented to the user U2.
- FIG. 7 is a flowchart showing Modification 1 of the third example of the operation of support system 10 according to the embodiment.
- steps that are the same as those shown in FIG. 6 are given the same step numbers.
- the points different from the third example will be mainly described.
- The evaluation unit 32d further compares the rates of decrease in the degree of concentration of the plurality of subjects U1 to determine whether or not there is a degree of concentration whose rate of decrease exceeds a reference value (S41).
- If the evaluation unit 32d determines that there is a degree of concentration whose rate of decrease exceeds a reference value (for example, the average value or the median value of the rates of decrease of the plurality of degrees of concentration) (Yes in S41), the subject U1 corresponding to that degree of concentration is evaluated as a subject whose degree of concentration is decreasing at a relatively high rate (S42).
- the presentation unit 45 presents the evaluation results of steps S33 and S42 to the user U2 (S43).
- If not (No in S41), the presentation unit 45 presents the evaluation result of step S33 to the user U2 (S34).
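- The comparison in steps S41 and S42 can be sketched as follows, using the mean of all subjects' rates of decrease as the reference value (one of the examples the text gives). The per-subject dictionary representation is an assumption.

```python
# Illustrative sketch of Modification 1 of the third example:
# flag the subjects whose rate of decrease in concentration exceeds
# a reference value (here: the mean of all subjects' rates).
from statistics import mean

def flag_fast_decliners(rates: dict[str, float]) -> list[str]:
    """rates: subject id -> rate of decrease in degree of concentration.
    Returns the ids whose rate exceeds the reference value."""
    reference = mean(rates.values())
    return [sid for sid, r in rates.items() if r > reference]
```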
- FIG. 8 is a flow chart showing Modification 2 of the third example of the operation of support system 10 according to the embodiment.
- steps that are the same as those shown in FIGS. 6 and 7 are given the same step numbers.
- The differences from the third example and from Modification 1 of the third example will be mainly described.
- The evaluation unit 32d further compares the degrees of concentration of the plurality of subjects U1 with a reference value (for example, the average value or median value of the plurality of degrees of concentration) to determine whether or not there is a degree of concentration that has remained below the reference value for a certain period of time or longer (for example, 5 minutes or longer) (S51). If the evaluation unit 32d determines that there is a degree of concentration that has remained below the reference value (for example, the average value or median value of the plurality of degrees of concentration) for the certain period of time or longer (Yes in S51), the corresponding subject U1 is evaluated as having a relatively low degree of concentration (S52). Next, the presentation unit 45 presents the evaluation results of steps S33 and S52 to the user U2 (S53).
- If not (No in S51), the presentation unit 45 presents the evaluation result of step S33 to the user U2 (S34).
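- The criterion of step S51 can be sketched as follows. The reference value is taken as the mean of the subjects' latest degrees of concentration (one of the text's examples), and a one-sample-per-minute history is an illustrative assumption.

```python
# Illustrative sketch of Modification 2 of the third example:
# a subject whose degree of concentration has stayed below the
# reference value for a certain period (5 min in the text) is
# evaluated as having a relatively low degree of concentration.
from statistics import mean

MINUTES = 5  # "certain period of time or longer"

def relatively_low(histories: dict[str, list[float]]) -> list[str]:
    """histories: subject id -> per-minute degree-of-concentration samples."""
    # Reference value: mean of the latest degree across all subjects.
    reference = mean(h[-1] for h in histories.values())
    return [sid for sid, h in histories.items()
            if len(h) >= MINUTES and all(v < reference for v in h[-MINUTES:])]
```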
- In the fourth example, the concentration state of the subject U1 is determined based on the past data of the subject U1 stored in the database 35.
- Here, the case of one target person U1 will be described, but the concentration state of a plurality of target people U1 may be determined by the same method. Differences from the first example and the second example will be described below with reference to FIG.
- The determination unit 32c compares the degree of concentration of the subject U1 calculated in step S12 of FIG. with the past degree of concentration of the subject U1 stored in the database 35, and determines the concentration state of the subject U1 (not shown). Then, the presentation unit 45 may present the user U2 with the concentration state of the subject U1 output by the output unit 32e.
- the database 35 stores, for example, at least the identification information (for example, ID number) of the subject U1 and the past degree of concentration of the subject U1 in association with each other. As shown in FIG. 4, the degree of concentration is time-series data of the degree of concentration.
- The determination unit 32c determines whether or not the difference between the representative value (for example, average value or median value) of the time-series data of the degree of concentration of the subject U1 and the representative value of the time-series data of the past degree of concentration of the subject U1 read from the database 35 is greater than a positive threshold value (for example, 0.2). Then, when the difference value exceeds the positive threshold, the determination unit 32c determines that the concentration state of the subject U1 is improving. On the other hand, when the difference value does not exceed (that is, falls below) the positive threshold value, the determination unit 32c determines whether or not the difference value falls below a negative threshold value (for example, -0.2).
- the determination unit 32c determines that the concentration state of the subject U1 has deteriorated when the difference value is less than the negative threshold. On the other hand, the determining unit 32c determines that the concentration state of the subject U1 is stable (that is, there is no significant change) when the difference value does not fall below (that is, exceeds) the negative threshold.
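- The fourth example's comparison against past data can be sketched as follows. The representative value is taken as the mean (one of the text's examples), and the ±0.2 thresholds follow the text.

```python
# Illustrative sketch of the fourth example: compare the representative
# value (mean here) of the current concentration time-series with that
# of the past time-series from the database, against a positive
# threshold (+0.2) and a negative threshold (-0.2).
from statistics import mean

def judge_against_past(current: list[float], past: list[float]) -> str:
    diff = mean(current) - mean(past)
    if diff > 0.2:
        return "improving"     # difference exceeds the positive threshold
    if diff < -0.2:
        return "deteriorated"  # difference falls below the negative threshold
    return "stable"            # no significant change
```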
- [Modification 1 of the fourth example] In the fourth example, the concentration state of the subject U1 is determined by comparing the degree of concentration of the subject U1 calculated by the calculation unit 32b with the past degree of concentration of the subject U1 stored in the database 35, and the determined concentration state of the subject U1 is presented to the user U2. Modification 1 differs from the fourth example in that, as in Modification 1 of the second example, when it is determined that the concentration state of the subject U1 has deteriorated, the fact that the concentration state has deteriorated is presented to the user.
- In Modification 1, the database 35 further stores the past concentration state of the subject U1 in association with the identification information of the subject U1.
- The determination unit 32c may derive a reference value of the degree of concentration in the typical state of concentration of the subject U1 based on the past degree of concentration and the past state of concentration of the subject U1 stored in the database 35. More specifically, the determination unit 32c may derive, as the reference value, the average value or the median value of the time-series data of the degree of concentration in the typical state of concentration.
- the determination unit 32c may determine the concentration state of the subject U1, for example, based on whether the degree of concentration of the subject U1 calculated by the calculation unit 32b is below the reference value. At this time, the determination unit 32c may determine that the concentration state of the subject U1 has deteriorated when the degree of concentration of the subject U1 is below the reference value. At this time, the presentation unit 45 of the second terminal device 40 presents to the user U2 that the concentration state of the target person U1 has deteriorated.
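- The reference-value derivation and the comparison above can be sketched as follows, using the mean of past samples recorded during a concentrated state (one of the text's examples). The (degree, was_concentrated) pair representation of the database records is an assumption.

```python
# Illustrative sketch of Modification 1 of the fourth example:
# derive a per-subject reference value from past samples taken while
# the subject was in a concentrated state, then judge the current
# degree of concentration against it.
from statistics import mean

def derive_reference(samples: list[tuple[float, bool]]) -> float:
    """samples: (degree, was_concentrated) pairs from the database."""
    return mean(d for d, concentrated in samples if concentrated)

def state_deteriorated(current_degree: float, reference: float) -> bool:
    """True when the current degree falls below the reference value."""
    return current_degree < reference
```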
- Modification 2 of the fourth example differs from Modification 1 of the fourth example in the determination criteria for the concentration state of the subject U1.
- In Modification 1, deterioration of the concentration state of the subject U1 is determined based on a numerical reference value. Modification 2 differs from Modification 1 in that the concentration state of the subject U1 is determined based on concentration patterns.
- the database 35 further stores the time-series data of the past degree of concentration of the subject U1 and the concentration pattern corresponding to the time-series data of the past degree of concentration in association with each other.
- the determination unit 32c may derive a typical concentration pattern of the subject U1 based on the past time-series data of the degree of concentration stored in the database 35 and the concentration pattern.
- A typical concentration pattern of the subject U1 may be classified, for example, as one of type I (constantly high degree of concentration), type II (constantly low degree of concentration), type III (gradual transition from a high to a low degree of concentration), and type IV (waves in the degree of concentration).
- The determination unit 32c derives the concentration pattern of the subject U1 based on the time-series data of the degree of concentration calculated by the calculation unit 32b, and determines the concentration state of the subject U1 by comparing the derived concentration pattern with the typical concentration pattern of the subject U1. More specifically, for example, when the typical concentration pattern of the subject U1 is type I (constantly high degree of concentration) and the derived concentration pattern is type II (constantly low degree of concentration), type III (gradual transition from a high to a low degree of concentration), or type IV (waves in the degree of concentration), the determination unit 32c determines that the concentration state of the subject U1 is deteriorating. Further, for example, when the typical concentration pattern of the subject U1 is type IV, the determination unit 32c determines that the concentration state of the subject U1 is improving when the derived concentration pattern is type I.
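- The pattern classification and comparison can be sketched as follows. Only the four type names come from the text; the classification rule (comparing first-half and second-half means against a 0.5 boundary) and the boundary value itself are illustrative assumptions, crude stand-ins for whatever pattern matching the system actually performs.

```python
# Illustrative sketch of Modification 2 of the fourth example:
# classify a concentration time-series into the four pattern types
# (I constantly high, II constantly low, III high-to-low, IV waves)
# and compare the derived pattern with the subject's typical pattern.
from statistics import mean

def classify_pattern(series: list[float], boundary: float = 0.5) -> str:
    half = len(series) // 2
    first, second = mean(series[:half]), mean(series[half:])
    if first >= boundary and second >= boundary:
        return "I"    # constantly high
    if first < boundary and second < boundary:
        return "II"   # constantly low
    if first >= boundary > second:
        return "III"  # gradual transition from high to low
    return "IV"       # waves (low-then-high or otherwise mixed)

def compare_with_typical(current: list[float], typical: str) -> str:
    derived = classify_pattern(current)
    if typical == "I" and derived in ("II", "III", "IV"):
        return "deteriorating"
    if typical == "IV" and derived == "I":
        return "improving"
    return "unchanged"
```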
- When the output unit 32e acquires the determination result of the concentration state of the subject U1 determined by the determination unit 32c, it outputs the acquired determination result of the concentration state to the second terminal device 40 (not shown).
- the presentation unit 45 of the second terminal device 40 presents the concentration state of the target person U1 to the user U2.
- In the above examples, the calculation unit 32b calculates the degree of concentration from the image of the subject U1 (more specifically, the feature amount included in the image) by information processing based on a rule-based algorithm, but the calculation is not limited to this.
- the calculator 32b may use the machine learning model 34 to calculate the degree of concentration from the image of the subject U1 (feature amount included in the image). Since the machine learning model 34 has been described above, the description thereof is omitted here.
- The determination unit 32c may further determine advice regarding at least one of the content of the intellectual work of the subject U1, the tool used for performing the intellectual work, and the method of performing the intellectual work, and the advice may be offered to at least one of the subject U1 and the user U2.
- the advice is advice for improving the efficiency of intellectual work, and is included in the support information for supporting the target person U1 in performing the intellectual work.
- FIG. 9 is a diagram showing an example of advice proposed to the subject U1.
- the support system 10 may present advice to the first terminal device 20 used by the subject U1 according to the concentration state of the subject U1.
- the support system 10 may present advice on the contents of the intellectual work to the first terminal device 20 when the determination unit 32c determines that the concentration state of the subject U1 is deteriorating.
- When the first terminal device 20 receives an instruction to present advice from the information processing device 30, it may display advice to switch the study subject to another subject on the display unit (display).
- The subject U1 may input an instruction to switch the study subject to mathematics according to the advice, or, if the subject U1 does not wish to switch the study subject according to the advice, may touch the "skip" button to input an instruction not to switch the study subject.
- the support system 10 may present to the user U2 that the study subject of the subject U1 has been switched.
- FIG. 10 is a diagram showing an example of information about the target person U1 proposed to the user U2.
- For example, the support system 10 may present information about the subject U1 (for example, Mr. S) to the second terminal device 40 when the subject U1 switches the learning subject to another subject.
- The information about the subject U1 includes, for example, an indication that the concentration state of the subject U1 is declining, a graph of the time-series data of the degree of concentration, and a notification that the learning subject has been switched.
- The reception unit 44 accepts an instruction to terminate the presentation.
- As described above, the support system 10 includes the acquisition unit 32a that acquires the feature amount related to the biological activity of the subject U1 who is performing intellectual work, the calculation unit 32b that calculates the degree of concentration of the subject U1 on the intellectual work based on the feature amount acquired by the acquisition unit 32a, and the presentation unit 45 that presents to the user U2 information about the target person U1 including the degree of concentration of the target person U1 calculated by the calculation unit 32b.
- the subject U1 is a student or a worker
- the user U2 is a teacher of the student or a manager of the worker.
- Such a support system 10 can support the user U2 by presenting information about the target person U1, including the degree of concentration of the target person U1, so that the user U2 can grasp the degree of concentration of the target person U1. Thereby, the user U2 can grasp whether or not the subject U1 is concentrating on the intellectual work.
- the support system 10 further includes a determination unit 32c that determines the concentration state of the subject U1 based on changes in the degree of concentration calculated by the calculation unit 32b.
- the user U2 is presented with the concentration state of the target person U1.
- Such a support system 10 can support the user U2 to grasp the concentration state of the target person U1. Thereby, the user U2 can grasp whether or not the target person U1 is concentrating on the intellectual work.
- The determination unit 32c determines the concentration state of the subject U1 based on whether the degree of concentration has decreased by more than a predetermined rate, and when the degree of concentration has decreased by more than the predetermined rate, the presentation unit 45 presents to the user U2 that the concentration state of the subject U1 has deteriorated.
- Such a support system 10 presents the fact to the user U2 when the concentration state of the target person U1 has deteriorated, so the user U2 can sequentially grasp the deterioration of the concentration state of the target person U1.
- the determination unit 32c determines the concentration state of the subject U1 based on whether or not the degree of concentration has remained below a threshold value (for example, the first threshold value in FIG. 4) for a certain period of time or longer.
- the presentation unit 45 presents to the user U2 that the concentration state of the subject person U1 has deteriorated when the degree of concentration has remained below the threshold value (first threshold value) for a certain period of time or longer.
- the determination unit 32c determines the concentration state of the subject U1 based on whether the number of times the degree of concentration falls below a threshold (first threshold) within a certain period of time exceeds a predetermined number of times. If the number of times the degree of concentration falls below the threshold (first threshold) within a certain period of time exceeds a predetermined number of times, the presentation unit 45 presents to the user U2 that the concentration state of the subject U1 has deteriorated.
- The determination unit 32c determines whether the target person U1 can maintain a state of concentration based on the time interval at which the degree of concentration calculated by the calculation unit 32b falls below the threshold value (first threshold value). If the time interval at which the degree of concentration falls below the threshold value (first threshold value) is shorter than a predetermined interval, the presentation unit 45 presents to the user U2 that the target person U1 is not in a state where the concentration state can be maintained.
- Such a support system 10 can determine that the subject U1 is not in a state of being able to maintain the state of concentration, and can present this to the user U2. Thereby, the user U2 can grasp a sign that the concentration state of the subject U1 may deteriorate.
- the support system 10 further includes a database 35 in which the identification information of the target person U1 and the past degree of concentration of the target person U1 are linked and stored.
- The determination unit 32c determines the concentration state of the subject U1 by comparing the degree of concentration of the subject U1 calculated by the calculation unit 32b with the past degree of concentration of the subject U1 stored in the database 35.
- Such a support system 10 can determine the current concentration state of the subject U1 by referring to the database 35 storing the past degree of concentration of the subject U1 and comparing the current degree of concentration with the past degree of concentration. Thereby, the user U2 can grasp whether the subject U1 is able to concentrate on intellectual work more than usual.
- In the support system 10, the database 35 further stores the past concentration state of the subject U1 in a linked manner. The determination unit 32c derives, from the past degree of concentration and the past concentration state stored in the database 35, the reference value of the degree of concentration in the typical state of concentration of the subject U1, and determines the concentration state of the subject U1 based on whether the degree of concentration calculated by the calculation unit 32b is below the reference value. If the degree of concentration is below the reference value, the presentation unit 45 presents to the user U2 that the concentration state of the subject U1 has deteriorated.
- Such a support system 10 can determine the concentration state of the subject U1 using, as a threshold, the reference value of the degree of concentration in the typical state of concentration of the subject U1, which is derived based on past data such as the degree of concentration and the state of concentration of the subject U1. Therefore, since the support system 10 can determine the concentration state according to the characteristics of the subject U1, the concentration state of the subject U1 can be determined more accurately.
- In the support system 10, the degree of concentration is time-series data of the degree of concentration, and the database 35 further stores the time-series data of the past degree of concentration of the subject U1 and the concentration pattern corresponding to that time-series data in association with each other.
- The determination unit 32c derives a typical concentration pattern of the subject U1 based on the time-series data of the past degree of concentration and the concentration pattern stored in the database 35, derives the concentration pattern of the subject U1 based on the time-series data of the degree of concentration calculated by the calculation unit 32b, and determines the concentration state of the subject U1 by comparing the derived concentration pattern of the subject U1 with the typical concentration pattern of the subject U1.
- Such a support system 10 can determine the concentration state of the subject U1 by comparing the concentration pattern of the subject U1 with the typical concentration pattern of the subject U1. Further, for example, the support system 10 may derive a typical concentration pattern of the subject U1 according to the physical condition of the subject U1, the season, or the content of the intellectual work that the subject U1 is working on. In this case, the support system 10 can accurately determine the concentration state of the subject U1 depending on which typical concentration pattern the subject U1's concentration pattern resembles.
- In the support system 10, the acquisition unit 32a further acquires at least one of information about the content of the intellectual work performed by the subject U1, information about the progress of the intellectual work by the subject U1, and information about the surroundings of the subject U1.
- the determination unit 32c further determines the relationship between the degree of concentration calculated by the calculation unit 32b and the information obtained by the acquisition unit 32a.
- Such a support system 10 can determine the concentration state of the subject U1 from multiple perspectives by considering the relationship between the degree of concentration of the subject U1 calculated by the calculation unit 32b and the information acquired by the acquisition unit 32a.
- the acquisition unit 32a further obtains information about the content of the intellectual work performed by the subject U1, information about the progress of the intellectual work by the subject U1, and information about the surroundings of the subject U1. and the determination unit 32c further determines the relationship between the concentration state of the subject U1 and the information acquired by the acquisition unit 32a.
- Since such a support system 10 can determine the relationship between the concentration state of the subject U1 and the information acquired by the acquisition unit 32a, it can determine the concentration state of the subject U1 from multiple perspectives.
- the acquisition unit 32a acquires the feature amount related to the biological activity of each of the plurality of subjects U1, and the calculation unit 32b calculates the degree of concentration of each of the plurality of subjects U1 based on those feature amounts.
- the presentation unit 45 presents to the user U2 information about the plurality of subjects U1, including the degree of concentration of each of the plurality of subjects U1 calculated by the calculation unit 32b.
- Such a support system 10 can support the user U2 so that he/she can grasp the degree of concentration of each of the plurality of subjects U1. Thereby, the user U2 can grasp whether or not each subject U1 is concentrating on the intellectual work with respect to the plurality of subjects U1.
- the support system 10 further includes an evaluation unit 32d that evaluates the relative concentration state of each of the plurality of subjects U1 by comparing the degrees of concentration of the plurality of subjects U1 calculated by the calculation unit 32b, and the presentation unit 45 presents to the user U2 the relative concentration state of each of the plurality of subjects U1 evaluated by the evaluation unit 32d.
- Such a support system 10 can support the user U2 in grasping the relative concentration state of each of the plurality of subjects U1. Thereby, the user U2 can identify, based on the relative concentration state, the subject U1 with a high priority for guidance or education.
- the evaluation unit 32d further compares the reduction rates of the concentration levels of the plurality of subjects U1.
- the evaluation unit 32d evaluates the subject U1 corresponding to such a degree of concentration as a subject U1 whose rate of decrease in the degree of concentration is relatively large, and the presentation unit 45 further presents that subject U1 to the user U2.
- Such a support system 10 can support the user U2 in grasping which of the plurality of subjects U1 has a relatively large rate of decrease in the degree of concentration.
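- A minimal sketch of the reduction-rate comparison described above; the start-to-end definition of the rate and all names are assumptions of this illustration, not taken from the disclosure:

```python
def decrease_rate(series):
    # Fractional drop from the first to the last sample of a
    # concentration time series (illustrative definition).
    start, end = series[0], series[-1]
    return (start - end) / start if start else 0.0

def fastest_decliner(histories):
    # Subject whose degree of concentration fell at the largest rate.
    return max(histories, key=lambda name: decrease_rate(histories[name]))

histories = {"A": [0.9, 0.8, 0.7], "B": [0.9, 0.5, 0.3]}
flagged = fastest_decliner(histories)  # "B"
```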
- the evaluation unit 32d further compares the degrees of concentration of the plurality of subjects U1 with a reference value and, if any degree of concentration has remained below the reference value for a certain period of time or more, evaluates the subject U1 corresponding to that degree of concentration as a subject U1 with a relatively low degree of concentration; the presentation unit 45 further presents to the user U2 the subject U1 with the relatively low degree of concentration evaluated by the evaluation unit 32d.
- Such a support system 10 can support the user U2 in grasping which of the plurality of subjects U1 has a relatively low degree of concentration.
- the reference value is the average value or median value of a plurality of degrees of concentration.
- Such a support system 10 uses the average value or the median value of the degrees of concentration of the plurality of subjects U1 as the reference value, so the degree of concentration of each subject U1 can be evaluated relatively. Thereby, the user U2 can relatively grasp the degrees and states of concentration of the plurality of subjects U1, and can easily identify the subject U1 who needs support.
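- The reference-value evaluation described above can be sketched as follows; taking the mean or median of each subject's latest sample, and the `window` representation of "a certain period of time", are illustrative choices, and every name here is hypothetical:

```python
import statistics

def flag_low_concentration(histories, window, use_median=False):
    # Reference value: mean (or median) of the latest sample of every
    # subject; a subject is flagged when all of their last `window`
    # samples stayed below that reference value.
    latest = [h[-1] for h in histories.values()]
    ref = statistics.median(latest) if use_median else statistics.mean(latest)
    flagged = [name for name, h in histories.items()
               if len(h) >= window and all(v < ref for v in h[-window:])]
    return ref, flagged

histories = {"A": [0.9, 0.9, 0.9], "B": [0.2, 0.2, 0.2], "C": [0.7, 0.7, 0.7]}
ref, low = flag_low_concentration(histories, window=3)  # low == ["B"]
```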
- the support system 10a further includes a feature amount database 36 that stores the feature amount of the concentrated state and the feature amount of the non-concentrated state; the determination unit 32c determines whether the subject U1 is in a concentrated state or a non-concentrated state from the feature amount acquired by the acquisition unit 32a, and when the subject U1 is determined to be in a non-concentrated state, the presentation unit 45 presents that fact to the user U2.
- Such a support system 10a can determine whether the subject U1 is in a concentrated state or a non-concentrated state from the feature amount related to the biological activity of the subject U1 who is performing intellectual work, based on the feature amounts of the concentrated state and the non-concentrated state stored in the feature amount database 36. Therefore, the support system 10a can make this determination efficiently.
- the feature amount is information indicating at least one state of the subject U1's posture, hand movement, head movement, eye movement, facial expression, and handwriting.
- Such a support system 10 uses, as the feature amount related to the biological activity of the subject U1, information indicating at least one of the subject U1's posture, hand movement, head movement, eye movement, facial expression, and handwriting. Since the support system 10 can calculate the degree of concentration and determine the concentration state of the subject U1 based on this information, highly reliable calculation and determination results can be obtained.
- the feature amount is included in the data acquired by the contactless device.
- Such a support system 10 can acquire the feature amount related to the biological activity of the subject U1 using a non-contact device such as a camera, without hindering the intellectual work of the subject U1. Therefore, the support system 10 can more accurately calculate the degree of concentration of the subject U1.
- the acquisition unit 32a further acquires information indicating whether or not the subject U1 is present, and the calculation unit 32b, based on the information acquired by the acquisition unit 32a, uses the feature amount obtained while the subject U1 is present to calculate the degree of concentration.
- Such a support system 10 uses a sensor that detects the presence or absence of a person, such as the human sensor 2, so that only the feature amount related to the biological activity of the subject U1 obtained while the subject U1 is present is used to calculate the degree of concentration. Accordingly, the support system 10 can more accurately calculate the degree of concentration of the subject U1.
- the calculator 32b uses the machine learning model 34 to calculate the degree of concentration from the feature amount.
- the support system 10 may further include a model generation unit (not shown in FIG. 2) that generates the machine learning model 34.
- such a support system 10 can more quickly and accurately calculate the degree of concentration of the subject U1.
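- The disclosure does not specify the form of the machine learning model 34; purely as a stand-in, the following sketch maps feature amounts to a degree of concentration with a linear scorer squashed to the interval [0, 1]. The weights, the example features, and all names are hypothetical:

```python
import math

def predict_concentration(features, weights, bias=0.0):
    # Weighted sum of feature amounts (e.g. posture stability, gaze
    # steadiness) passed through a sigmoid, yielding a degree in [0, 1].
    z = sum(w * x for w, x in zip(weights, features)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# With zero evidence either way, the sketch yields the neutral degree 0.5.
neutral = predict_concentration([0.0, 0.0], [1.5, -0.8])
```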
- the support method executed by a computer such as the support system 10 includes an acquisition step (S11) of acquiring a feature amount related to the biological activity of the subject U1 who is performing intellectual work, a calculation step of calculating the degree of concentration of the subject U1 on the intellectual work based on the acquired feature amount, and a presentation step of presenting to the user U2 information about the subject U1 including the calculated degree of concentration.
- Such a support method can support the user U2 by presenting information about the subject U1, including the degree of concentration of the subject U1, so that the user U2 can grasp the degree of concentration of the subject U1. Thereby, the user U2 can grasp whether or not the subject U1 is concentrating on the intellectual work.
- the program causes the computer to execute the above support method.
- the program supports the user U2 by presenting information about the subject U1, including the degree of concentration of the subject U1, so that the user U2 can grasp the degree of concentration of the subject U1. Thereby, the user U2 can grasp whether or not the subject U1 is concentrating on the intellectual work.
- FIG. 11 is a block diagram illustrating an example of a functional configuration of a support system according to a modification of the embodiment.
- the support system 10a differs from the support system 10 according to the embodiment in that it further determines, based on the feature amount database 36, whether the subject U1 is in a concentrated state or a non-concentrated state from the feature amount acquired by the acquisition unit 32a.
- the support system 10a further includes a feature amount database 36 in addition to the configuration of the support system 10 according to the embodiment. Based on the feature amount database 36, the support system 10a may determine whether the subject U1 is in a concentrated state or a non-concentrated state from the feature amounts related to the biological activity of the subject U1 who is performing intellectual work in the work space 4 (see FIG. 1), and may present the determination result to the user U2. For example, when the determination unit 32c determines that the subject U1 is in a non-concentrated state, the support system 10a may present to the user U2 that the subject U1 is in a non-concentrated state.
- the support system 10a includes two or more first terminal devices 20, an information processing device 30a, and a second terminal device 40.
- the support system 10a differs from the support system 10 in that it includes an information processing device 30a.
- differences from the support system 10 according to the embodiment will be mainly described, and redundant descriptions will be omitted or simplified.
- the information processing device 30a acquires an image of the subject U1 captured by the camera 1, determines, based on the feature amount database 36, whether the subject U1 is in a concentrated state or a non-concentrated state from the feature amount included in the acquired image, and outputs information about the subject U1 to the second terminal device 40 of the user U2 based on the determination result.
- the information processing device 30a includes a communication unit 31, a control unit 32, a storage unit 33, and a model generation unit 37. Details of the function of each component included in the control unit 32 will be described later in an operation example.
- the feature amount database 36 is stored in the storage unit 33, for example.
- the feature amount database 36 stores, for example, the feature amounts of the concentration state and the feature amounts of the non-concentration state of a plurality of subjects U1.
- the feature amount is, for example, information indicating at least one state of the subject U1's posture, hand movement, head movement, eye movement, facial expression, and handwriting.
- the model generation unit 37 generates the machine learning model 34.
- the model generation unit 37 may generate the machine learning model 34 by learning the relationship between the feature amount and the degree of concentration. Since the learning of the machine learning model 34 has been described in the above embodiment, description thereof will be omitted here.
- the model generation unit 37 stores the generated machine learning model 34 in the storage unit 33 .
- when the subject U1 touches a button displayed on the touch panel display of the first terminal device 20 to start intellectual work, the first terminal device 20 transmits an instruction signal that causes the camera 1 and the human sensor 2 to start sensing. Upon receiving the instruction signal, the camera 1 and the human sensor 2 start sensing and transmit sensing data (for example, image data and data indicating the presence or absence of a person) to the first terminal device 20.
- the first terminal device 20 transmits an image of the subject U1 to the information processing device 30a based on the sensing data acquired from the camera 1 and the human sensor 2. Specifically, the first terminal device 20 transmits, to the information processing device 30a, the image captured by the camera 1 while the human sensor 2 detects the presence of a person (here, the subject U1) as the image of the subject U1 (more specifically, an image in which the head of the subject U1 appears).
- the acquisition unit 32a of the information processing device 30a periodically acquires the image of the target person U1 transmitted from the first terminal device 20 via the communication unit 31.
- the image of the subject U1 includes at least the head of the subject U1 photographed substantially from the front.
- the image of the head of the subject U1 contains information such as the movement of the head of the subject U1, the state of the eyes of the subject U1 (inner corner of the eye, iris, corneal reflex, or pupil), or the facial expression of the subject U1. These pieces of information are the so-called feature amounts related to the biological activity of the subject U1.
- the acquiring unit 32a acquires the feature amount related to the biological activity of the subject U1 who is performing intellectual work.
- although the acquisition unit 32a acquires an image (feature amount) captured while the subject U1 is present, it may also acquire the image captured by the camera 1 together with information indicating whether or not the subject U1 detected by the human sensor 2 is present.
- the determination unit 32c determines whether the subject U1 is in a concentrated state or a non-concentrated state from the feature amount acquired by the acquisition unit 32a. More specifically, the determination unit 32c may make this determination based on whether the feature amount acquired by the acquisition unit 32a matches a feature amount for the concentrated state or for the non-concentrated state stored in the feature amount database 36 in the storage unit 33.
- for example, the determination unit 32c may determine that the subject U1 is in a concentrated state when the feature amount of the subject U1 acquired by the acquisition unit 32a (for example, the feature amount included in the image) matches a feature amount for the concentrated state stored in the feature amount database 36, and may determine that the subject U1 is in a non-concentrated state when it matches a feature amount for the non-concentrated state. Matching is not limited to complete matching.
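- The matching described above could be sketched as follows; the exemplar vectors, the tolerance value, and all names are invented for this illustration and are not part of the disclosure:

```python
FEATURE_DB = {
    # Hypothetical exemplar feature vectors, standing in for the
    # feature amount database 36 (values are invented).
    "concentrated":     [(0.1, 0.9), (0.2, 0.8)],
    "non-concentrated": [(0.8, 0.1), (0.9, 0.2)],
}

def classify_state(feature, tol=0.15):
    # "Matching is not limited to complete matching": treat an observed
    # vector as matching when every component is within `tol` of an exemplar.
    def close(a, b):
        return all(abs(x - y) <= tol for x, y in zip(a, b))
    for state, exemplars in FEATURE_DB.items():
        if any(close(feature, e) for e in exemplars):
            return state
    return "undetermined"
```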
- FIG. 12 is a diagram showing an example of the feature amount during concentration and the feature amount during non-concentration stored in the feature amount database 36.
- the feature amount is, for example, a feature amount included in an image, and may be information indicating the posture, hand movement, head movement, and facial expression of a person performing intellectual work.
- the image may be a still image or a moving image.
- FIGS. 12(a) and 12(b) show examples of feature amounts when the person in the image is concentrating on intellectual work.
- when the determination unit 32c determines that the feature amount acquired by the acquisition unit 32a matches either (a) or (b) of FIG. 12, it determines that the subject U1 is in a concentrated state.
- the presentation unit 45 may present to the user U2 that the target person U1 is in a concentrated state.
- FIGS. 12(c) and 12(d) show examples of feature amounts in a state in which the degree of concentration of a person performing intellectual work has decreased (a so-called absent-minded state).
- (e) of FIG. 12 shows an example of the feature amount in a state in which the degree of concentration of the person is so low that intellectual work cannot be performed (a so-called dozing state).
- when the determination unit 32c determines that the feature amount acquired by the acquisition unit 32a matches any of (c) to (e) of FIG. 12, it determines that the subject U1 is in a non-concentrated state.
- the presentation unit 45 may present to the user U2 that the target person U1 is in a non-concentrated state.
- FIG. 13 is a diagram showing another example of the feature amount during non-concentration stored in the feature amount database 36.
- the feature quantity shown in FIG. 13 is handwriting, for example.
- (a) of FIG. 13 shows an example of handwriting when the subject U1 is feeling sleepy, (b) and (c) of FIG. 13 show further examples of handwriting in a non-concentrated state, and (d) of FIG. 13 shows an example of handwriting in a state where the subject U1 is dozing off.
- the feature amount database 36 may also store the handwriting of the subject U1 when concentrating.
- when the determination unit 32c determines that the feature amount acquired by the acquisition unit 32a (here, the handwriting of the subject U1) matches a feature amount for the concentrated state (handwriting) stored in the feature amount database 36, it determines that the subject U1 is in a concentrated state.
- when the determination unit 32c determines that the feature amount acquired by the acquisition unit 32a (the handwriting of the subject U1) matches a feature amount for the non-concentrated state (handwriting) stored in the feature amount database 36, it determines that the subject U1 is in a non-concentrated state.
- the presentation unit 45 may present the determination result to the user U2.
- the feature amount during concentration and the feature amount during non-concentration stored in the feature amount database 36 are not limited to the above examples.
- the feature amount database 36 may store so-called motions such as body movements and hand movements during non-concentration as feature amounts during non-concentration.
- movements of people during non-concentration may be motions unrelated to intellectual work, such as operating a smartphone (for example, exchanging messages, playing games, or watching videos), drawing illustrations in a notebook, spinning a writing instrument with the fingers, rocking the body back and forth, whistling, or humming.
- the acquisition unit 32a may acquire the movement of the hand of the subject U1 appearing in the image acquired by the camera 1 as a feature amount.
- the determination unit 32c may determine whether the subject U1 is in a concentrated state or a non-concentrated state based on the hand movements during concentration and during non-concentration stored in the feature amount database 36.
- Modification 1 of the first example: in the first example, the feature amount is included in an image, whereas in Modification 1 of the first example the feature amount is included in sensing data other than an image. In this case, the determination unit 32c determines that the subject U1 is in a non-concentrated state if the feature amount acquired by the acquisition unit 32a deviates from the feature amount for the concentrated state stored in the feature amount database 36.
- the support system 10a may include an infrared sensor, a distance sensor, etc. in addition to the camera 1 and the human sensor 2.
- the range sensor may be a depth-capable camera.
- the distance sensor may capture a distance image in which at least the head of the subject U1 appears, or may acquire a distance image in which the upper body of the subject U1 appears.
- the distance sensor may be placed beside the camera 1 shown in FIG. 1, or it may be placed on the right or left side of the desk.
- the determination unit 32c may determine that the subject U1 is in a state of non-concentration based on the thresholds of the feature amounts during concentration and non-concentration stored in the feature amount database 36 .
- infrared sensors may be installed on the left and right sides of the desk shown in FIG.
- an emitting part that emits infrared rays may be installed on one of the right and left sides of the desk, and a light receiving part that receives the infrared rays emitted from the emitting part may be installed on the other of the right and left sides of the desk.
- the installation positions of the emitting part and the light receiving part of the infrared sensor (for example, the height from the desk and the distance from the subject U1) may be determined in advance.
- the determination unit 32c may determine that the subject U1 is in a non-concentrated state based on the stored threshold value (for example, a certain period of time) of the feature amounts during concentration and during non-concentration detected by the infrared sensor.
- the acquisition unit 32a may acquire the movement of the subject U1's hand appearing in the image acquired by the camera 1 as a feature amount.
- the determination unit 32c may determine whether the subject U1 is in a concentrated state or a non-concentrated state based on threshold values (for example, a stationary time) of hand movements during concentration and during non-concentration stored in the feature amount database 36.
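- The stationary-time threshold described above could be sketched as follows for a one-dimensional hand coordinate; the `move_eps` and `still_limit` values, and all names, are illustrative assumptions rather than values from the disclosure:

```python
def hand_still_too_long(timestamps, positions, move_eps=0.01, still_limit=30.0):
    # Declare non-concentration when the tracked hand coordinate has not
    # moved by more than `move_eps` for `still_limit` seconds.
    still_since = timestamps[0]
    for t, p_prev, p in zip(timestamps[1:], positions, positions[1:]):
        if abs(p - p_prev) > move_eps:
            still_since = t          # movement resets the stationary timer
        if t - still_since >= still_limit:
            return True
    return False
```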
- the support system 10a further includes the feature amount database 36 that stores the feature amount of the concentrated state and the feature amount of the non-concentrated state; the determination unit 32c determines whether the subject U1 is in a concentrated state or a non-concentrated state from the feature amount acquired by the acquisition unit 32a, and when the determination unit 32c determines that the subject U1 is in a non-concentrated state, the presentation unit 45 presents to the user U2 that the subject U1 is in a non-concentrated state.
- Such a support system 10a can determine whether the subject U1 is in a concentrated state or a non-concentrated state from the feature amount related to the biological activity of the subject U1 who is performing intellectual work, based on the feature amounts of the concentrated state and the non-concentrated state stored in the feature amount database 36. Therefore, the support system 10a can make this determination efficiently.
- the feature amount is information indicating at least one state of the subject U1's posture, hand movement, head movement, eye movement, facial expression, and handwriting.
- Such a support system 10a uses, as the feature amount related to the biological activity of the subject U1, information indicating at least one of the subject U1's posture, hand movement, head movement, eye movement, facial expression, and handwriting. Since the support system 10a can calculate the degree of concentration and determine the concentration state of the subject U1 based on this information, highly reliable calculation and determination results can be obtained.
- the calculation unit 32b calculates the degree of concentration based on the image of the subject U1 acquired by the acquisition unit 32a, but is not limited to this aspect.
- the calculator 32b may calculate the degree of concentration based on vital information such as heartbeat, pulse, respiratory rate, or electroencephalogram of the subject U1.
- in this case, the acquisition unit 32a acquires vital information from a sensor that detects vital information, instead of acquiring an image from the camera 1. Further, the calculation unit 32b may acquire the voice of the subject U1 (for example, a sigh or an utterance) and calculate the degree of concentration from it. In that case, the acquisition unit 32a acquires sound information picked up by, for example, a microphone.
- the calculation unit 32b may calculate the degree of concentration of the subject U1 in a complex manner based on the above vital information, sound information, etc., in addition to the image information.
- the machine learning model 34 may be composed of one machine learning model, or of two or more machine learning models. When it is composed of two or more machine learning models, the information input to each machine learning model may be the same or may differ. Each machine learning model is trained, for example, by the learning method described above.
- the calculation unit 32b may calculate the degree of concentration of the subject U1 based on the degrees of concentration output from the machine learning models. For example, the calculation unit 32b may calculate the degree of concentration of the subject U1 based on table information in which combinations of the degrees of concentration output from the machine learning models are associated with degrees of concentration of the subject U1.
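- The table-based combination described above could be sketched as follows; looking up a table keyed by rounded per-model scores, and averaging as a fallback when no table entry applies, are our own illustrative choices, not taken from the disclosure:

```python
def combine_scores(scores, table=None):
    # `table` maps a combination of per-model degrees of concentration
    # (rounded to one decimal) to a final degree for the subject.
    table = table or {}
    key = tuple(round(s, 1) for s in scores)
    if key in table:
        return table[key]
    # Fallback when the combination is not tabulated: plain average.
    return sum(scores) / len(scores)
```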
- the machine learning model 34 may be an AI (Artificial Intelligence) model.
- the determination unit 32c and the evaluation unit 32d may determine or evaluate the state of concentration by referring not only to the degree of concentration calculated by the calculation unit 32b, but also to other parameters.
- the determination unit 32c and the evaluation unit 32d may further refer to parameters such as the type of intellectual work performed by the subject U1, the time zone in which the subject U1 performs the intellectual work, or the place where the subject U1 performs the intellectual work.
- the feature amount related to the biological activity of the subject U1 acquired by the acquisition unit 32a is mainly information related to the head of the subject U1, but is not limited to this.
- the feature amount may include information related to parts other than the head of the subject U1, such as movement of the shoulders of the subject U1.
- in the above embodiment, the camera 1 and the human sensor 2 are not included in the support systems 10 and 10a, but at least one of the camera 1 and the human sensor 2 may be included in the support system 10.
- the support systems 10 and 10a are implemented by a plurality of devices, but may be implemented by a single device.
- the components provided in the support systems 10 and 10a may be distributed to the plurality of devices in any way.
- the components included in the server device (for example, the information processing device 30) in the above embodiments may be included in an information terminal installed in a closed space. That is, the present invention may be realized by cloud computing or by edge computing.
- the communication method between devices in the above embodiment is not particularly limited.
- a relay device (not shown) may intervene in communication between devices.
- each component may be realized by executing a software program suitable for each component.
- Each component may be realized by reading and executing a software program recorded in a recording medium such as a hard disk or a semiconductor memory by a program execution unit such as a CPU or processor.
- each component may be realized by hardware.
- each component may be a circuit (or integrated circuit). These circuits may form one circuit as a whole, or may be separate circuits. These circuits may be general-purpose circuits or dedicated circuits.
- the present invention may be realized as a support method executed by a computer such as the support systems 10 and 10a, as a program for causing a computer to execute such a support method, or as a computer-readable non-transitory recording medium on which such a program is recorded.
Landscapes
- Business, Economics & Management (AREA)
- Engineering & Computer Science (AREA)
- Human Resources & Organizations (AREA)
- Economics (AREA)
- Tourism & Hospitality (AREA)
- Strategic Management (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Entrepreneurship & Innovation (AREA)
- Marketing (AREA)
- General Business, Economics & Management (AREA)
- Physics & Mathematics (AREA)
- Educational Administration (AREA)
- Quality & Reliability (AREA)
- Operations Research (AREA)
- Development Economics (AREA)
- Game Theory and Decision Science (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Primary Health Care (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
A support system (10) comprising: an acquisition unit (32a) that acquires a feature amount relating to a biological activity of a subject (U1) performing intellectual work; a calculation unit (32b) that calculates the degree of concentration of the subject (U1) on the intellectual work based on the feature amount acquired by the acquisition unit (32a); and a presentation unit (45) that presents, to a user (U2), information about the subject (U1) including the degree of concentration calculated by the calculation unit (32b).
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021140123 | 2021-08-30 | ||
JP2021-140123 | 2021-08-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023032335A1 true WO2023032335A1 (fr) | 2023-03-09 |
Family
ID=85412051
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/016058 WO2023032335A1 (fr) | 2021-08-30 | 2022-03-30 | Système d'assistance, procédé d'assistance, et programme |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2023032335A1 (fr) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060064037A1 (en) * | 2004-09-22 | 2006-03-23 | Shalon Ventures Research, Llc | Systems and methods for monitoring and modifying behavior |
JP2010204926A (ja) * | 2009-03-03 | 2010-09-16 | Softbank Bb Corp | モニタリングシステム、モニタリング方法、およびプログラム |
JP2015223224A (ja) * | 2014-05-26 | 2015-12-14 | パナソニックIpマネジメント株式会社 | 集中度の評価装置、プログラム |
JP2018140162A (ja) * | 2017-02-28 | 2018-09-13 | パナソニックIpマネジメント株式会社 | 作業適正度判定システム |
JP2018194292A (ja) * | 2014-01-14 | 2018-12-06 | パナソニックIpマネジメント株式会社 | 環境制御システム |
- 2022-03-30: WO application PCT/JP2022/016058 filed (published as WO2023032335A1); status: active, Application Filing
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9800834B2 (en) | Method and system of group interaction by user state detection | |
US8243132B2 (en) | Image output apparatus, image output method and image output computer readable medium | |
Olugbade et al. | How can affect be detected and represented in technological support for physical rehabilitation? | |
Wang et al. | Automated student engagement monitoring and evaluation during learning in the wild | |
US9498123B2 (en) | Image recording apparatus, image recording method and image recording program stored on a computer readable medium | |
Fiorini et al. | Daily gesture recognition during human-robot interaction combining vision and wearable systems | |
KR102515987B1 (ko) | Apparatus and method for detecting learners' participation in non-face-to-face online classes | |
Revadekar et al. | Gauging attention of students in an e-learning environment | |
Yu et al. | Magic mirror table for social-emotion alleviation in the smart home | |
US20220160227A1 (en) | Autism treatment assistant system, autism treatment assistant device, and program | |
JP2023160899A (ja) | Concentration degree measurement device, concentration degree measurement method, and program | |
Al-Btoush et al. | New features for eye-tracking systems: Preliminary results | |
US20240379206A1 (en) | Monitoring and optimization of human-device interactions | |
US20210202078A1 (en) | Patient-Observer Monitoring | |
US20240004464A1 (en) | Transmodal input fusion for multi-user group intent processing in virtual environments | |
US10747308B2 (en) | Line-of-sight operation apparatus, method, and medical device | |
WO2023032335A1 (fr) | Assistance system, assistance method, and program | |
Gogia et al. | Multi-modal affect detection for learning applications | |
JP7653607B2 (ja) | Concentration degree estimation device, concentration degree estimation method, and program | |
Soyel et al. | Towards an affect sensitive interactive companion | |
US11328174B2 (en) | Cluster classification device, environment generation device, and environment generation system | |
WO2023032617A1 (fr) | Determination system, determination method, and program | |
JP7607272B2 (ja) | Learning support device and learning support system | |
Pangestu et al. | The Heron Formula Approach for Head Movements Detection for Student Focus Detection during Pandemic Online Class | |
Behoora et al. | Quantifying emotional states based on body language data using non invasive sensors |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22863925; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 22863925; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: JP |