WO2018164298A1 - Device for providing emotional information based on spatial and temporal information - Google Patents
- Publication number
- WO2018164298A1 (PCT/KR2017/002562)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- emotion
- time
- environment
- users
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/583—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
Definitions
- the present invention relates to an emotional information providing apparatus based on spatial and temporal information, and more particularly, to an emotional information providing apparatus based on spatial and temporal information for enhancing accessibility and utilization of emotional information.
- An object of the present invention is to provide an emotion information providing apparatus based on space and time information for improving accessibility and utilization of emotion information.
- the emotional information providing apparatus includes: an input unit for receiving environmental information including temperature, humidity, illuminance, and noise information around a plurality of users, image information obtained by photographing the surroundings of the plurality of users, and emotion information for the plurality of users;
- a controller for matching spatial information and time information, at which the environment information, image information, and emotion information were generated, with the environment information, image information, and emotion information;
- a database generator for storing the environment information, image information, and emotion information matched with the spatial information and time information to generate a database;
- and an information output unit for analyzing the database according to search information received from a user terminal and outputting an analysis result.
- the emotion information is generated by comparing bio-signal measurement data, behavioral measurement data, and environmental measurement data with respective reference values, or by using the vocabulary of web content usage data, and can be categorized into dimensional emotions along a pleasant-discomfort axis and an awakening-relaxation axis.
- the controller may match the area and time at which the environment information and image information were generated with the environment information and image information, and match the generation area and time of the data used to generate the emotion information with the emotion information.
- the information output unit may calculate and output at least one of a total average value and a total change amount for each category of the database according to at least one of a regional unit and a time unit included in the search information.
- the information output unit may calculate at least one of an average value and a change amount for each category of the database according to at least one of a plurality of regional units and a plurality of time units included in the search information, and then generate and output a comparison result against at least one of the calculated total average value and total change amount.
- the user can grasp the objective emotions people feel about a specific event or phenomenon.
- the user can use this information for marketing or as objective data for research analysis.
- FIG. 1 is a block diagram of an emotion information providing apparatus according to an embodiment of the present invention.
- FIG. 2 is a diagram for describing two-dimensional emotion information according to the pleasant-discomfort axis and the arousal-relaxation axis according to an embodiment of the present invention.
- FIG. 3 is a flowchart illustrating a method of providing emotion information according to an embodiment of the present invention.
- FIG. 4 is a diagram for describing search information according to an embodiment of the present invention.
- FIG. 5 is a view for explaining a method of outputting an analysis result according to an exemplary embodiment of the present invention.
- FIG. 1 is a block diagram of an emotion information providing apparatus according to an embodiment of the present invention.
- the emotion information providing apparatus 100 includes an input unit 110, a controller 120, a database generator 130, and an information output unit 140.
- the input unit 110 receives environment information including temperature, humidity, illuminance, and noise information around a plurality of users, image information photographing a plurality of user surroundings, and emotion information about a plurality of users.
- the input unit 110 receives, in real time, environmental information including data on the temperature, humidity, illuminance, noise, and gas around the user, and image information, from sensors such as a thermometer, hygrometer, illuminance meter, noise meter, gas meter, and camera. The input unit 110 receives the emotion information from the user terminal.
- the emotional information is generated by comparing biosignal measurement data, behavioral measurement data, and environmental measurement data with respective reference values, or by using the vocabulary of web content usage data, and is categorized into two-dimensional emotions along a pleasant-discomfort axis and an awakening-relaxation axis.
- the controller 120 matches the spatial information and time information in which the environment information, the image information, and the emotion information are generated with the environment information, the image information, and the emotion information.
- the controller 120 matches the region and time at which the environment information and the image information are generated to the environment information and the image information, and the generation region and time of the data used to generate the emotion information to the emotion information.
- the database generator 130 generates a database by storing environment information, image information, and emotion information matched with spatial information and time information.
- the information output unit 140 analyzes the database according to the search information received from the user terminal and outputs the analysis result.
- the information output unit 140 calculates and outputs at least one of the total average value and the total change amount for each category of the database according to at least one of the region unit and the time unit included in the search information.
- the information output unit 140 calculates at least one of the average value and the change amount for each category of the database according to at least one of the plurality of regional units and the plurality of time units included in the search information, and then generates and outputs a comparison result against at least one of the total average value and the total change amount.
- the information output unit 140 may categorize sensory stimuli based on the environment information and image information according to the two-dimensional emotions and output them.
- FIG. 2 is a diagram for describing two-dimensional emotion information according to the pleasant-discomfort axis and the arousal-relaxation axis according to an embodiment of the present invention.
- the two-dimensional emotion information is classified into a total of nine dimensional emotions according to the two-dimensional coordinate system of the pleasant-discomfort axis (x-axis) and the awakening-relaxation axis (y-axis).
- space 1 is unpleasant-awakening
- space 2 is awakening
- space 3 is pleasant-awakening
- space 4 is unpleasant
- space 5 is neutral
- space 6 is pleasant
- space 7 is unpleasant-relaxation
- space 8 is relaxation
- space 9 is pleasant-relaxation
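The nine-space grid above can be sketched as a lookup over discretized two-dimensional coordinates. This is an illustrative sketch, not the patent's implementation: the function name, the neutral band width, and the use of a symmetric [-1, 1] scale are assumptions.

```python
# Hedged sketch: map a (valence, arousal) pair on the pleasant-discomfort (x)
# and awakening-relaxation (y) axes to one of the nine dimensional emotions.
# The neutral_band threshold is an illustrative assumption.

def classify_emotion(valence: float, arousal: float, neutral_band: float = 0.33) -> str:
    def level(v: float) -> int:
        if v > neutral_band:
            return 1       # pleasant / awakening side
        if v < -neutral_band:
            return -1      # unpleasant / relaxation side
        return 0           # neutral band around the origin

    labels = {
        (-1,  1): "unpleasant-awakening",   # space 1
        ( 0,  1): "awakening",              # space 2
        ( 1,  1): "pleasant-awakening",     # space 3
        (-1,  0): "unpleasant",             # space 4
        ( 0,  0): "neutral",                # space 5
        ( 1,  0): "pleasant",               # space 6
        (-1, -1): "unpleasant-relaxation",  # space 7
        ( 0, -1): "relaxation",             # space 8
        ( 1, -1): "pleasant-relaxation",    # space 9
    }
    return labels[(level(valence), level(arousal))]
```

For example, a strongly positive valence with high arousal lands in space 3, while coordinates near the origin fall into the neutral space 5.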
- the emotion information may be generated by applying at least one of biometric signal measurement data, behavior measurement data, environmental measurement data, and web content usage data of the subject to an emotion analysis algorithm.
- the biosignal measurement data, behavior measurement data, environmental measurement data, and web content usage data may be the same data as the environmental information and the image information input by the input unit 110.
- the emotion analysis algorithm extracts heart rate variability (HRV) by analyzing the user's photoplethysmogram (PPG) or ballistocardiogram (BCG), and then extracts the average of the HRV peak values and the coherence ratio.
- HRV: heart rate variability
- PPG: photoplethysmogram
- BCG: ballistocardiogram
- the emotion analysis algorithm may set any one of the nine dimensional emotions for each time slot by comparing the average value and coherence ratio of the extracted peak values with preset reference values.
- for example, the emotion analysis algorithm may set space 1, that is, the unpleasant-awakening dimensional emotion, for the corresponding time slot.
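The per-time-slot rule above can be sketched as a pair of threshold comparisons. The patent does not publish concrete reference values or the exact combination rule, so the thresholds and the mapping from peak mean to arousal and coherence ratio to valence are illustrative assumptions.

```python
# Hedged sketch: compare the mean HRV peak value and the coherence ratio
# against preset reference values and name a dimensional emotion for the
# time slot. peak_ref and coh_ref are assumed placeholder references.

def slot_emotion(peak_mean: float, coherence_ratio: float,
                 peak_ref: float = 0.5, coh_ref: float = 0.5) -> str:
    # Assumed mapping: higher mean peak -> awakening, higher coherence -> pleasant.
    arousal = "awakening" if peak_mean > peak_ref else "relaxation"
    valence = "pleasant" if coherence_ratio > coh_ref else "unpleasant"
    return f"{valence}-{arousal}"
```

Under these assumptions, a high peak mean with low coherence would yield the space 1 example from the text, unpleasant-awakening.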
- the emotion analysis algorithm extracts features such as the subject's total moving distance, number of clusters, entropy, circadian movement pattern, transition time, location variance, and location diversity from GPS data, and then compares the variance, total distance, and the like with respective preset reference values to set any one of the nine dimensional emotions for each time slot.
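Two of the GPS features named above, total moving distance and location variance, can be sketched directly from a list of position fixes. This sketch uses plain Euclidean distance in coordinate units for illustration; a real implementation would use a geodesic distance, and the function name and track format are assumptions.

```python
import math

def gps_features(track):
    """Extract (total moving distance, location variance) from a list of
    (lat, lon) fixes recorded during one time slot. Distances are computed
    as Euclidean in coordinate units, purely for illustration."""
    # Total moving distance: sum of consecutive step lengths.
    total = sum(math.dist(a, b) for a, b in zip(track, track[1:]))
    # Location variance: mean squared spread of fixes around their centroid.
    cx = sum(p[0] for p in track) / len(track)
    cy = sum(p[1] for p in track) / len(track)
    variance = sum((p[0] - cx) ** 2 + (p[1] - cy) ** 2 for p in track) / len(track)
    return total, variance
```

These two values would then be compared against preset reference values, as the text describes, to pick one of the nine dimensional emotions for the slot.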
- the emotion analysis algorithm extracts the color elements (pixel values) constituting the image from the image data, extracts the magnitude of the noise from the noise measurement data, and compares them with respective preset reference values to set any one of the nine dimensional emotions for each time slot.
- the emotion analysis algorithm extracts the vocabulary of the web content used by the subject from the web content data and compares it with a previously stored database to set any one of nine dimensional emotions for each time slot.
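The vocabulary step above can be sketched as a lexicon lookup over the subject's web content text. The tiny lexicon below is an illustrative assumption standing in for the previously stored database the patent mentions; the function name and the majority-vote rule are also assumptions.

```python
from collections import Counter

# Assumed toy lexicon mapping emotion-bearing words to dimensional emotions.
LEXICON = {
    "great": "pleasant", "love": "pleasant",
    "awful": "unpleasant", "hate": "unpleasant",
    "excited": "awakening", "calm": "relaxation",
}

def slot_emotion_from_text(text: str) -> str:
    """Count lexicon hits in the web content for a time slot and return the
    dominant dimensional emotion, or 'neutral' if no hit is found."""
    hits = Counter(LEXICON[w] for w in text.lower().split() if w in LEXICON)
    return hits.most_common(1)[0][0] if hits else "neutral"
```

A real system would use a full emotion vocabulary database rather than a hand-written dictionary, but the comparison structure is the same.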
- FIG. 3 is a flowchart illustrating a method of providing emotion information according to an embodiment of the present invention.
- the input unit 110 receives environmental information including temperature, humidity, illuminance, and noise information around a plurality of users, image information obtained by photographing the surroundings of the plurality of users, and emotion information for the plurality of users (S310).
- control unit 120 matches the spatial information and time information in which the environment information, the image information, and the emotion information are generated with the environment information, the image information, and the emotion information (S320).
- the controller 120 matches the area and time at which the environment information and the image information were generated with the environment information and the image information, and matches the generation area and time of the data used to generate the emotion information with the emotion information. That is, the controller 120 tags the environment information and the image information with the region and time at which they were generated.
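The controller's tagging step described above amounts to attaching spatial and time metadata to each incoming record before it is stored. The field names and flat dictionary layout in this sketch are illustrative assumptions.

```python
from datetime import datetime

def tag_record(payload: dict, region: str, when: datetime) -> dict:
    """Attach the generation region and time to an environment/image/emotion
    record so the resulting database can be searched by space and time."""
    record = dict(payload)                  # copy the measured data
    record["region"] = region               # spatial information tag
    record["timestamp"] = when.isoformat()  # time information tag
    return record
```

The database generator would then persist these tagged records, making the regional and time-unit queries of the information output unit possible.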
- the database generation unit 130 generates a database by storing the environment information, image information, and emotion information matched with the spatial information and time information (S330).
- the information output unit 140 analyzes the database according to the search information received from the user terminal and outputs the analysis result (S340).
- the information output unit 140 calculates and outputs at least one of the total average value and the total change amount for each category of the database according to at least one of the region unit and the time unit included in the search information.
- the information output unit 140 calculates at least one of the average value and the change amount for each category of the database according to at least one of the plurality of regional units and the plurality of time units included in the search information, and then generates and outputs a comparison result against at least one of the total average value and the total change amount.
- the information output unit 140 may infer sensory stimuli for each category of the database according to at least one of a plurality of local units and a plurality of time units included in the search information.
- the information output unit 140 compares the average value for each category according to the search information with a preset threshold range, and classifies the result into one of nine sensory stimuli according to the two-dimensional coordinate system of the pleasant-discomfort axis (x-axis) and the awakening-relaxation axis (y-axis).
- for example, if the average illuminance is 5 lux, the information output unit 140 may select the unpleasant-relaxation sensory stimulus by comparing 5 lux with the threshold range.
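The threshold comparison behind the 5 lux example can be sketched for the illuminance category. The patent gives only the 5 lux data point, so the comfortable range and the labels for the out-of-range cases are illustrative assumptions.

```python
# Hedged sketch: classify an average illuminance (in lux) into a sensory
# stimulus by comparing it against an assumed comfortable threshold range.

def illuminance_stimulus(avg_lux: float,
                         comfortable=(300.0, 500.0)) -> str:
    low, high = comfortable
    if avg_lux < low:
        return "unpleasant-relaxation"  # dim environment, e.g. the 5 lux example
    if avg_lux > high:
        return "unpleasant-awakening"   # assumed label for glare
    return "pleasant"                   # within the comfortable range
```

Analogous threshold ranges would exist for each category (temperature, humidity, noise, and so on), each feeding the same nine-way classification.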
- FIG. 4 is a diagram for describing search information according to an embodiment of the present invention.
- the search information includes local information and date information.
- the date information includes year, month, day, and time information.
- the user may input Gangnam as the local information and May 2016 as the date information as the search information.
- the search information also includes output form information of data to be output by the user.
- the output form information of the data includes a map data view, an average data view, a change data view, a comparison average data view, and a comparison change data view.
- FIG. 5 is a view for explaining a method of outputting an analysis result according to an exemplary embodiment of the present invention.
- the average data view refers to an output form in which an average of data corresponding to local information or date information included in the search information is provided to the user as a graph or a numerical value.
- for example, suppose search information is input with the local information set to all regions and the date information set to all dates. Then, as shown in FIG. 5(a), the information output unit 140 calculates an average value for each category over all regions and all dates, and outputs it as a graph or a numerical value.
- the comparison average data view refers to an output form in which the averages of data corresponding to a plurality of local information or a plurality of date information included in the search information are provided to the user as graphs or numerical values.
- for example, the information output unit 140 calculates the average value for each category over all regions and all dates, and the average value for each category of the data tagged Gangnam, January 1, 2016, 13:00, and then outputs them as graphs or numerical values.
- the map data view refers to an output form in which emotional information is displayed on the map and provided to the user.
- for example, the information output unit 140 calculates the average value of the emotion information tagged Seoul, January 5, 2017 for each administrative district, and then maps the color corresponding to each average value onto the Seoul map and outputs it.
- the change data view refers to an output form that provides a user with a graph or a numerical value of a change in data corresponding to local information or date information included in the search information.
- the information output unit 140 calculates the average value for each category of the tagged data of Busan, March 5, 2017 for each time zone, and outputs it as a graph or a numerical value.
- the comparison change data view refers to an output form in which a change trend of data corresponding to a plurality of local information or a plurality of date information included in the search information is provided to the user as a graph or a numerical value.
- for example, the information output unit 140 calculates the average value for each category of the data tagged Seoul, January 1, 2016 for each time zone, and the average value for each category of the data tagged Seoul, January 1, 2017 for each time zone.
- the information output unit 140 outputs the calculated average values in graphs or numerical values, respectively.
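The average and comparison views described above reduce to filtering tagged records by region or date and averaging a category. This sketch assumes a flat record layout with `region`, `timestamp`, and per-category fields; the function name and parameters are illustrative assumptions.

```python
# Hedged sketch of the average data view: select records matching the search
# information (region and/or date prefix) and average one category. Calling
# it twice with different filters gives the comparison views.

def average_view(records, region=None, date_prefix=None, category="temperature"):
    sel = [r for r in records
           if (region is None or r["region"] == region)
           and (date_prefix is None or r["timestamp"].startswith(date_prefix))]
    return sum(r[category] for r in sel) / len(sel) if sel else None
```

For the comparison change data view, the same filter would be applied per time zone for each of the two periods, and the two series of averages plotted side by side.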
- the user can grasp the objective emotions people feel about a specific event or phenomenon.
- the user can use this information for marketing or as objective data for research analysis.
Abstract
Disclosed is a device for providing emotional information based on spatial and temporal information. The emotional information providing device comprises: an input unit for receiving environmental information including temperature, humidity, illuminance, and noise information around a plurality of users, image information obtained by photographing the environments around the plurality of users, and emotional information for the plurality of users; a control unit for matching spatial information and time information, at which the environmental information, image information, and emotional information were generated, with the environmental information, image information, and emotional information; a database generation unit for storing the environmental information, image information, and emotional information matched with the spatial information and time information to generate a database; and an information output unit for outputting an analysis result after analyzing the database according to search information received from a user terminal. Thus, since the situations people encounter by area and time can be provided together with the emotional information corresponding to those situations, users can grasp an objective emotion felt by people during a specific event or phenomenon. The invention is advantageous in that users can use such information in marketing or as objective data for research.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/KR2017/002562 WO2018164298A1 (fr) | 2017-03-09 | 2017-03-09 | Device for providing emotional information based on spatial and temporal information |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/KR2017/002562 WO2018164298A1 (fr) | 2017-03-09 | 2017-03-09 | Device for providing emotional information based on spatial and temporal information |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018164298A1 true WO2018164298A1 (fr) | 2018-09-13 |
Family
ID=63447891
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2017/002562 WO2018164298A1 (fr) | 2017-03-09 | 2017-03-09 | Dispositif de fourniture d'informations émotionnelles d'après des informations spatiales et temporelles |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2018164298A1 (fr) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111881348A (zh) * | 2020-07-20 | 2020-11-03 | 百度在线网络技术(北京)有限公司 | 信息处理方法、装置、设备以及存储介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100822029B1 (ko) * | 2007-01-11 | 2008-04-15 | 삼성전자주식회사 | 모바일 단말기에서의 사용자 히스토리를 이용한 개인화서비스 방법 및 그 시스템 |
KR20090129707A (ko) * | 2008-06-13 | 2009-12-17 | (주)시지웨이브 | 감성 어휘 의미 구조를 이용한 대표 감성 어휘 추출 방법 |
KR20120045459A (ko) * | 2010-10-29 | 2012-05-09 | 아주대학교산학협력단 | 라이프 케어 서비스 제공 시스템 |
KR101334894B1 (ko) * | 2012-02-15 | 2013-12-05 | 상명대학교서울산학협력단 | 위치 기반 감성 인식 방법 및 이를 적용하는 시스템 |
KR20160029375A (ko) * | 2014-09-05 | 2016-03-15 | 삼성전자주식회사 | 사용자의 생체신호를 모니터링 및 분석하는 장치 및 방법 |
- 2017-03-09: WO PCT/KR2017/002562 patent WO2018164298A1 (fr) — active, Application Filing
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111881348A (zh) * | 2020-07-20 | 2020-11-03 | 百度在线网络技术(北京)有限公司 | 信息处理方法、装置、设备以及存储介质 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Khomami et al. | Persian sign language recognition using IMU and surface EMG sensors | |
Sahoo et al. | Sign language recognition: State of the art | |
Poppe et al. | AMAB: Automated measurement and analysis of body motion | |
CN110363129 (zh) | Early autism screening system based on a smile paradigm and audio-video behavior analysis | |
WO2023043012A1 (fr) | Bioregulation method using image content, computer program, and system | |
Novais et al. | The role of non-intrusive approaches in the development of people-aware systems | |
Kadar et al. | Affective computing to enhance emotional sustainability of students in dropout prevention | |
WO2023003284A1 (fr) | Artificial-intelligence-based psychological examination system and operating method thereof | |
CN110364260 (zh) | Early autism assessment device and system based on an indicative language paradigm | |
WO2019231038A1 (fr) | Method for providing beauty content | |
WO2018164298A1 (fr) | Device for providing emotional information based on spatial and temporal information | |
Minkin | Including Information-Physical Quantities of Personality Traits into the International System of Units (SI) | |
WO2012115294A1 (fr) | Ubiquitous-learning middleware device for generating a study emotion index related to study concentration level from a bio-signal emotion index and context information | |
WO2012115295A1 (fr) | Ubiquitous-learning study efficiency improvement device for improving a user's study efficiency according to a study emotion index generated from a bio-signal emotion index and context information | |
Ebner et al. | Depth map color constancy | |
WO2022010149A1 (fr) | Method and system for generating a dataset relating to facial expressions, and non-transitory computer-readable recording medium | |
WO2015178549A1 (fr) | Method and apparatus for providing a security service using a stimulus at or below a threshold | |
WO2019168220A1 (fr) | Method and device for diagnosing brand personality using a brand personality map | |
Winoto et al. | The development of a Kinect-based online socio-meter for users with social and communication skill impairments: a computational sensing approach | |
Roh et al. | Performance analysis of face recognition algorithms on Korean face database | |
Kuliga et al. | From Real to Virtual and Back: A multi-method approach for investigating the impact of urban morphology on human spatial experiences | |
Cavanagh | The perception of form and motion | |
WO2019199035A1 (fr) | Gaze tracking system and method | |
Kray et al. | Taming context: A key challenge in evaluating the usability of ubiquitous systems | |
WO2019054598A1 (fr) | Eye tracking method and user terminal for performing same | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17899620 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 17899620 Country of ref document: EP Kind code of ref document: A1 |