WO2018060993A1 - Method and system for personality-weighted emotion analysis - Google Patents

Method and system for personality-weighted emotion analysis

Info

Publication number
WO2018060993A1
Authority
WO
WIPO (PCT)
Prior art keywords
personality
emotion
individual
signal
classifier
Prior art date
Application number
PCT/IL2017/051083
Other languages
English (en)
Inventor
Itzhak Wilf
Shai Gilboa
David GAVRIEL
Original Assignee
Faception Ltd.
Priority date
Filing date
Publication date
Application filed by Faception Ltd. filed Critical Faception Ltd.
Publication of WO2018060993A1


Classifications

    • G - PHYSICS
    • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L - SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L25/00 - Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L25/48 - Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
    • G10L25/51 - Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
    • G10L25/63 - Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination for estimating an emotional state
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G06F18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24133 - Distances to prototypes
    • G06F18/24137 - Distances to cluster centroïds
    • G06F18/2414 - Smoothing the distance, e.g. radial basis function networks [RBFN]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/28 - Determining representative reference patterns, e.g. by averaging or distorting; Generating dictionaries
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174 - Facial expression recognition
    • G06V40/175 - Static expression

Definitions

  • The present invention relates to the field of emotion recognition systems. More particularly, the invention relates to a method and system for estimating emotion from one or more emotion-related signals in a manner that is weighted or moderated by an underlying personality.
  • the emotional state of an individual affects concentration, task solving and decision making skills, as well as behavior, reaction to situations, messages and external stimuli. Human-to-human and human-to-machine interactions are also affected by the emotional state of an individual.
  • Emotion recognition can increase the efficiency of medical treatment, rehabilitation, and counseling. It can help patients deal with stress, anxiety and depression.
  • In e-learning, the presentation style of an online tutor can be adjusted according to the learner's emotional state. Monitoring the emotional state and detecting anger can prevent violence and improve call-center management.
  • Emotion recognition can also adjust content (e.g. music, games) to the mood of the users, and in the field of marketing it can increase the impact of advertisements by increasing attention and engagement.
  • Emotion recognition algorithms typically perform facial detection and semantic analysis to interpret mood from images, videos, text, and speech.
  • Facial expression detection tools categorize emotions into Joy, Sadness, Anger, Fear, Surprise, Contempt, and Disgust.
  • With facial emotion detection, algorithms detect faces within an image or video and sense expressions by analyzing the relationships between points on the face of an individual, based on facial emotion databases compiled in academic environments.
  • Emotions can be defined as discrete emotion labels or on a continuous scale.
  • Many databases are based on the basic emotions theory, introduced by Paul Ekman, which assumes the existence of six discrete basic emotions: Anger, Fear, Disgust, Surprise, Joy and Sadness.
  • The Extended Cohn-Kanade Dataset (CK+) [Kanade, T., Cohn, J. F., & Tian, Y. (2000). Comprehensive Database for Facial Expression Analysis. Fourth IEEE International Conference on Automatic Face and Gesture Recognition] comprises 593 posed image sequences (327 of which have discrete emotion labels) for 123 subjects, covering the 6 basic emotions plus a neutral expression.
  • Fig. 1 depicts a sample of the CK+ database, showing 7 basic emotions of 5 subjects.
  • Shan et al. [Shan, C., Gong, S., & McOwan, P. W. (2009). Facial expression recognition based on local binary patterns: A comprehensive study. Image and Vision Computing] investigate various combinations of face image descriptors with machine learning classifiers to identify emotion from 6- and 7-class (including neutral) expressions.
  • Shan et al. use Local Binary Patterns (LBP) as a basic descriptor, further suggesting partitioning of the face image into multiple windows, and selecting the most discriminative feature for each emotion, with a boosting method.
  • Shan shows that the best results for the CK+ dataset are achieved with a Support Vector Machine (SVM) classifier. An SVM represents the training examples as points in space, mapped such that the examples of the separate categories are divided by a clear gap that is as wide as possible; new examples are then mapped into that same space and predicted to belong to a category based on which side of the gap they fall on.
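  • A minimal sketch of such an LBP-plus-SVM expression classifier follows (in the spirit of Shan et al., assuming scikit-image and scikit-learn; the boosted per-emotion feature selection is omitted, and `faces` / `labels` are assumed inputs):

```python
# Sketch: windowed uniform-LBP histograms + multi-class SVM, in the spirit
# of Shan et al. Assumes scikit-image and scikit-learn; `faces` is an
# iterable of aligned grayscale face crops, `labels` their emotion tags.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def lbp_histogram(face, grid=(7, 6), P=8, R=1):
    """Concatenate uniform-LBP histograms over a grid of face windows."""
    lbp = local_binary_pattern(face, P, R, method="uniform")
    n_bins = P + 2                      # P+1 uniform patterns + 1 catch-all bin
    feats = []
    for row in np.array_split(lbp, grid[0], axis=0):
        for win in np.array_split(row, grid[1], axis=1):
            hist, _ = np.histogram(win, bins=n_bins, range=(0, n_bins), density=True)
            feats.append(hist)
    return np.concatenate(feats)

X = np.stack([lbp_histogram(f) for f in faces])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.2, random_state=0)
clf = SVC(kernel="rbf", C=10.0).fit(X_tr, y_tr)   # multi-class via one-vs-one
print("held-out accuracy:", clf.score(X_te, y_te))
```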
  • Zhao et al. propose extracting a spatio-temporal motion local binary pattern (LBP) feature and integrating it with a Gabor multi-orientation fusion histogram, giving descriptors that reflect both static and dynamic texture information of facial expressions.
  • Fig. 2 shows another method for facial emotion detection, proposed by Pilla et al. [Pilla Jr., Valfredo, et al. (2016). Facial Expression Classification Using Convolutional Neural Network and Support Vector Machine, WVC 2016 - Workshop de Visão Computacional, Campo Grande, Mato Grosso, Brazil].
  • The input comprises a 156-pixel-wide by 176-pixel-high greyscale image derived from the CK+ Dataset (stage a).
  • AlexNet [Krizhevsky, A., et al., ImageNet Classification with Deep Convolutional Neural Networks, Advances in Neural Information Processing Systems 25 (NIPS 2012)] is a deep convolutional neural network trained to classify the 1.3 million high-resolution images of the LSVRC-2010 ImageNet training set into 1000 different classes. The network is often used to generate features for image classifiers, in a process known as transfer learning.
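  • A sketch of this transfer-learning step, assuming a recent PyTorch/torchvision with its pretrained AlexNet (the 4096-dimensional activation of the penultimate fully connected layer is taken as the image descriptor, which can then be fed to an SVM as above):

```python
# Sketch: pretrained AlexNet as a fixed feature extractor (transfer learning).
# Assumes torchvision >= 0.13; `image` is a PIL image (converted to RGB here,
# since CK+-style inputs are greyscale).
import torch
import torchvision.models as models
import torchvision.transforms as T

weights = models.AlexNet_Weights.IMAGENET1K_V1
alexnet = models.alexnet(weights=weights).eval()
alexnet.classifier = alexnet.classifier[:-1]      # drop the 1000-way output layer

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def describe(image):
    """Return a 4096-d AlexNet descriptor for one face image."""
    x = preprocess(image.convert("RGB")).unsqueeze(0)   # (1, 3, 224, 224)
    return alexnet(x).squeeze(0).numpy()                # features, not classes
```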
  • Sentiment analysis software processes text to conclude whether a statement is generally positive or negative, based on keywords and their valence index.
  • US patent no. 8965762 proposes combining voice analysis and image analysis to provide an improved measure of emotion.
  • As an example, consider a customer relations representative answering a video call from a customer complaining about the quality of a service or a delivered product. At some point during the call, the customer may express anger or frustration. Clearly, such emotions are easier to detect if the customer is an extrovert; an anger classifier trained and applied equally on all customers may therefore fail to detect the anger of an introverted customer.
  • the present invention relates to a method for detecting emotions of individuals according to the personality of the individuals, from one or more emotion-related signals.
  • The method comprises: a) training a personality-weighted emotion classifier; b) obtaining an emotion signal of an individual and at least one corresponding personality signal; c) feeding the emotion signal and said at least one personality signal to the classifier; and d) detecting an emotion according to the emotion signal and according to at least one personality trait level predicted from the individual's personality signal.
  • the classifier is trained by:
  • the emotion signal and the at least one personality signal are selected from:
  • The at least one personality trait is selected from the Big Five personality traits.
  • The personality traits are predicted by: a) performing face detection, alignment and cropping on one or more digital face images of an individual; b) performing feature extraction from the individual's digital face images so as to provide image descriptors of the individual; and c) applying a classifier model, obtained in a learning stage, to the individual's image descriptors so as to predict a personality trait of the individual.
  • the emotion is detected by an emotion detector that corresponds to the predicted personality trait level.
  • the personality trait level is predicted in real-time.
  • the personality trait level is predicted from previously gathered user data of the individual.
  • the emotion is detected by an unweighted emotion detector, and is multiplied by an emotion gain factor that corresponds to the predicted personality trait level.
  • the present invention relates to a system for detecting emotions according to the personality of the individual, comprising:
  • a memory comprising computer-readable instructions which, when executed by the at least one processor, cause the processor to execute a personality-weighted emotions detector, wherein the personality-weighted emotions detector:
  • The present invention also relates to a non-transitory computer-readable medium comprising instructions which, when executed by at least one processor, cause the processor to detect emotions according to the personality of the individual.
  • the present invention relates to a personality-weighted emotions detector, comprising:
  • a personality-weighted emotions classifier configured to receive as input an emotion signal and to output an emotion detected in the emotion signal;
  • a dataset acquiring module configured to obtain a dataset comprising a plurality of emotion-tagged training signals of a plurality of subjects;
  • a personality trait prediction module configured to predict at least one personality trait from input signals of subjects, and the level of each said personality trait; and
  • a processor configured to:
  • Fig. 1 shows a sample of the Extended Cohn-Kanade Dataset (CK+).
  • Fig. 2 shows a method for facial emotion detection.
  • Fig. 3 (prior art) is a flowchart describing a method of predicting personality traits from face images.
  • Fig. 4 schematically illustrates a method for creating emotion training data for different levels of extroversion, according to an embodiment of the present invention.
  • Fig. 5 is a flow chart describing the process of training emotion detection classifiers, according to an embodiment of the present invention.
  • Fig. 6 shows a diagram describing a personality-weighted emotion analyzer, according to an embodiment of the present invention.
  • Fig. 7 shows a diagram describing a personality-weighted or moderated emotion prediction, according to an embodiment of the invention.
  • Fig. 8 shows a diagram describing a method for personality-weighted emotion detection, according to an embodiment of the invention.
  • The present invention provides a method for estimating emotions of individuals from one or more emotion-related signals in a manner that is weighted or moderated by one or more personality traits of each subject individual. It is noted that, in contrast to emotions such as anger, sadness or joy, which are temporary states, human personality is for all practical purposes fixed: an introverted individual, for instance, will very probably still be introverted tomorrow and next month.
  • Fig. 3 is a flowchart describing the method of predicting personality traits from face images according to US 2015/0242707 A1. As indicated in this figure, the method includes a learning stage and a prediction stage.
  • the learning stage may involve the following procedure:
  • the provided digital images comprise personality metadata and are used as training data for the learning stage;
  • the prediction stage may involve the following procedure:
  • the present invention provides methods for detecting emotion or emotional state of an individual from one or more emotion-related signals in a manner that is weighted or moderated by the individual's personality.
  • As an example, a specific personality trait is selected: introversion / extroversion. It is clear that the present invention is not limited to the introversion / extroversion exemplary case, and can be applied to other traits from the Big Five, other personality traits, or a combination of traits.
  • digital image and video analysis can be used as input for predicting personality and emotion of an individual.
  • Other embodiments of the present invention may use input from other sources and/or types.
  • Subject personality can be predicted from a subject individual's digital media (e.g. from social media such as Facebook), followed by personality-weighted emotion analysis based on image / video / audio signals, typing patterns or touch patterns (e.g. the behavioral biometric of keystroke dynamics, which uses the manner and rhythm in which an individual types characters on a keyboard or keypad, or the manner in which an individual interacts with a touchscreen or similar touch-based input devices).
  • In the following, a 3-level personality trait prediction is considered: introvert / extrovert / balanced. It is clear that this method can be further generalized, for instance, to a 5-level prediction model or, given enough training data, to a continuous measure of introversion.
  • Fig. 4 schematically illustrates a method for creating emotion training data for different levels of extroversion, using a facial emotion database.
  • A 3-level extroversion classifier 420, such as depicted in Fig. 3, is applied to the expression image of each test subject. If, for instance, a subject is classified as extrovert by classifier 420, then all of the subject's facial images / video image sequences tagged as happy are stored in the extrovert happiness image store. Once all subjects are classified as extroverted / balanced / introverted, the whole database has been partitioned into emotion-tagged facial images for introverted, extroverted and balanced individuals.
  • Then, emotion detection classifiers are trained for each emotion and for each introversion value. For example, referring to Fig. 2, image features are extracted using AlexNet for all extrovert images. The features are grouped into training sets according to emotion type, and an SVM multi-class classifier is trained to predict emotion / expression from images of extroverts. The training process is then repeated for introverts and for subjects with balanced extroversion, as sketched below.
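  • A condensed sketch of this partition-then-train loop (cf. Figs. 4 and 5); `dataset`, `subject_features` and `personality_clf` are illustrative names assumed here, not components defined by the patent:

```python
# Sketch: partition emotion-tagged training data by predicted personality
# level, then train one multi-class emotion SVM per level.
from collections import defaultdict
import numpy as np
from sklearn.svm import SVC

def train_per_personality(dataset, subject_features, personality_clf):
    """dataset yields (subject_id, emotion_feature_vector, emotion_label)."""
    buckets = defaultdict(lambda: ([], []))              # level -> (X, y)
    for subject_id, emotion_feats, emotion_label in dataset:
        # e.g. level in {"introvert", "balanced", "extrovert"}
        level = personality_clf.predict([subject_features[subject_id]])[0]
        X, y = buckets[level]
        X.append(emotion_feats)
        y.append(emotion_label)
    # One emotion classifier per personality trait level.
    return {level: SVC().fit(np.stack(X), y) for level, (X, y) in buckets.items()}
```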
  • Fig. 5 is a flow chart describing the process of training emotion detection classifiers, according to an embodiment of the present invention, relying on personality-based partitioning of emotion training facial images, e.g. as shown in Fig. 4.
  • A dataset of emotion and personality signals is obtained, for instance by a dataset acquiring module adapted to obtain a dataset comprising a plurality of emotion-tagged training signals of a plurality of subjects.
  • the signals are facial images tagged by the emotion state expressed therein.
  • a personality classifier predicts the level of a personality trait of each subject individual in the dataset (e.g. introvert / balanced / extrovert), and the dataset is partitioned into subsets according to the personality trait level of each subject individual.
  • The signals are further partitioned into classes according to the emotion type.
  • Features and descriptors of the signals of each emotion in each personality trait level subset are extracted, and in step 505 a classifier is trained for each emotion at each personality trait level.
  • Fig. 6 shows a flowchart describing a personality-weighted emotion analyzer, according to an embodiment of the present invention.
  • Personality-related signals 600 are processed by the Introvert / Extrovert Signal Classifier 620 to determine the introversion level of the subject. That level serves as a selector for the specific emotion detector used to detect an emotion from emotion-related signals 610, which may include audio signals, face image / video signals, typing speed and pressure, and so on.
  • Emotion-related signals are processed by Emotion Feature Vector Extraction 615, which represents the emotion-related variation of the signal in a reduced form that can be classified.
  • Depending on the determined level, the corresponding detector is selected: Emotion Detector 630 for high introversion, Emotion Detector 640 for balanced introversion, or Emotion Detector 650 for high extroversion is used to detect that subject's emotion.
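  • In code, this selection amounts to a dictionary lookup keyed by the predicted introversion level; a minimal sketch reusing the per-level classifiers returned by `train_per_personality` above (names are illustrative):

```python
# Sketch of the selector in Fig. 6: the predicted introversion level chooses
# which per-personality emotion detector (630 / 640 / 650) classifies the
# emotion feature vector.
def detect_emotion(personality_feats, emotion_feats, personality_clf, detectors):
    level = personality_clf.predict([personality_feats])[0]  # e.g. "introvert"
    return detectors[level].predict([emotion_feats])[0]
```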
  • The embodiment of Fig. 6 may be used to predict personality, and subsequently emotion, in real time. For a random / anonymous subject about whom we have no prior knowledge, both must be predicted during the interaction. This requires predicting personality in a manner that is not affected by emotion.
  • For that purpose, personality-related signals 600 may be provided to a temporal selector (not shown) that would, for example, identify (for a video-based classifier) the frames of least facial activity and use only those frames to predict personality, as described in Fig. 3.
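  • One way such a temporal selector could look, under the assumption that the mean absolute difference between consecutive aligned face crops is an adequate proxy for facial activity:

```python
# Sketch: rank video frames by facial activity (approximated by inter-frame
# absolute difference of aligned face crops) and keep the k calmest frames
# for personality prediction. Assumes at least two frames of equal shape.
import numpy as np

def least_active_frames(frames, k=5):
    f = np.asarray(frames, dtype=np.float32)             # (N, H, W)
    diffs = np.abs(f[1:] - f[:-1]).mean(axis=(1, 2))     # activity per transition
    activity = np.empty(len(f))
    activity[0], activity[-1] = diffs[0], diffs[-1]
    activity[1:-1] = np.maximum(diffs[:-1], diffs[1:])   # per-frame activity
    calm = np.argsort(activity)[:k]                      # indices of calm frames
    return f[calm]
```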
  • Fig. 7 describes another embodiment of personality-weighted or moderated emotion prediction.
  • personality traits such as introversion / extroversion are fixed and do not change (at least during any relevant time frame).
  • Here, the identity of the subject is known, which allows access to previously gathered user data such as Facebook profile images, social media communications, and prior communication with the service provider. Therefore, personality-related signals, or even previously identified personality traits, can be retrieved for that specific user. If a user ID cannot be accessed as simply as a caller ID, the user may be identified by signal-based identification methods known in the prior art, for example face recognition.
  • Personality profiles for the user base are stored in the User Profile Data Store 700. After identifying the user by caller ID or other means, the user personality profile 710 is sent to the Introvert / Extrovert metadata classifier 720, which operates on relevant data collected, and optionally analyzed, before the real-time interaction. Personality-weighted or moderated prediction of the emotional state then proceeds as described hereinabove with respect to Fig. 6.
  • Fig. 8 shows a diagram describing a method for personality-weighted emotion detection according to another embodiment of the invention, based on the assumption that emotion signals (such as a happy or angry face) are of a similar nature among individuals of different personalities, but that the magnitude of said emotion signals depends on the subject's personality. For example, happiness / anger will be expressed more strongly by extroverted individuals than by introverted individuals.
  • The embodiment is also based on the general structure of a classifier (such as a Support Vector Machine) that produces a score (a number) that is subsequently compared against a threshold.
  • a large number of algorithms for classification can be phrased in terms of a linear function that assigns a score to each possible category k by combining the feature vector of an instance with a vector of weights, using a dot product.
  • The scores are normalized so that the SVM gap is centered around zero, such that positive scores indicate detection (of anger, for example) and negative scores indicate non-anger.
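  • For a linear model, this reduces to a dot product followed by a sign test; a minimal sketch, where `w` and `b` are assumed to come from SVM training:

```python
# Sketch: a linear classifier assigns score(x) = w . x + b. With scores
# normalized so the margin is centered on zero, the sign of the score is
# the detection decision (positive = anger, negative = non-anger).
import numpy as np

def anger_score(x, w, b):
    return float(np.dot(w, x) + b)

def detect_anger(x, w, b):
    return anger_score(x, w, b) > 0.0
```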
  • Emotion-related signals 800 are processed by unit 805 to generate emotion feature vectors, which are then passed through a classifier function 810 to generate an emotion classifier score 815. In the prior art, that score would be compared against a decision function (a threshold, for example) to detect an emotion state or its absence.
  • In the present embodiment, the emotion classifier score 815 is moderated by the subject's personality as follows:
  • Personality-related signals 820 are passed through a personality classifier 830 (an extroversion classifier in this embodiment) to predict the extroversion level (one of 3 or 5 levels, or a continuous number if classifier 830 is a regressor).
  • The extroversion level is passed to the emotion gain calculator 840, which produces an emotion gain factor 845 that is multiplied by the emotion classifier score 815 via multiplier 850.
  • The resulting Personality-Weighted Emotion Score 860 is utilized by the Emotion Decision Layer 870 to detect the emotion. Additionally, the Emotion Magnitude Predictor 880 predicts the magnitude of the emotion.
  • In this manner, the extroversion level moderates the estimated emotional state, so that anger is better detected for introverted subjects, and minor expressions of an extroverted person are not misclassified as anger. Beyond a threshold yes / no decision, the magnitude of the emotion can be predicted as well, after such moderation.
  • the emotion gain parameter is derived by heuristic / rule-based considerations.
  • Alternatively, the emotion gain parameter is derived in a training process, in which emotion signals are collected from training sets of different personality trait levels (e.g. introvert / balanced / extrovert).
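  • A sketch of this moderation path (Fig. 8), with an illustrative rule-based gain table; the gain values below are placeholders for exposition, not values given by the patent:

```python
# Sketch of Fig. 8: an unweighted emotion classifier score is multiplied by a
# personality-dependent gain (multiplier 850) before the threshold decision
# (decision layer 870); the moderated score also yields a magnitude estimate
# (magnitude predictor 880). Gain values are illustrative placeholders.
EMOTION_GAIN = {"introvert": 1.5, "balanced": 1.0, "extrovert": 0.7}

def personality_weighted_emotion(raw_score, extroversion_level, threshold=0.0):
    weighted = EMOTION_GAIN[extroversion_level] * raw_score
    detected = weighted > threshold
    magnitude = weighted if detected else 0.0
    return detected, magnitude
```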
  • The personality-weighted emotion analysis described hereinabove may be performed by executable code and instructions stored in a computer-readable medium and running on one or more processor-based systems.
  • Embodiments of the invention may be implemented as a computer process or a computing system, such as a computer program product or computer readable media.
  • the computer program product may be a computer storage media readable by a computer system and encoding a computer program of instructions for executing a computer process.
  • the computer program product may also be a propagated signal on a carrier readable by a computing system and encoding a computer program of instructions for executing a computer process.
  • the invention may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like.
  • the invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
  • program modules may be located in both local and remote memory storage devices.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Medical Informatics (AREA)
  • Acoustics & Sound (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • Psychiatry (AREA)
  • Hospice & Palliative Care (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Software Systems (AREA)
  • Child & Adolescent Psychology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a method for detecting emotions of individuals according to the personality of the individuals, from one or more emotion-related signals. The method comprises the following steps: a) training a personality-weighted emotion classifier; b) obtaining an emotion signal of an individual and at least one corresponding personality signal; c) feeding the emotion signal and said personality signal into the classifier; and d) detecting an emotion according to the emotion signal and according to at least one personality trait level predicted from the individual's personality signal.
PCT/IL2017/051083 2016-09-27 2017-09-27 Method and system for personality-weighted emotion analysis WO2018060993A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662400279P 2016-09-27 2016-09-27
US62/400,279 2016-09-27

Publications (1)

Publication Number Publication Date
WO2018060993A1 (fr) 2018-04-05

Family

ID=61759307

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2017/051083 WO2018060993A1 (fr) 2016-09-27 2017-09-27 Method and system for personality-weighted emotion analysis

Country Status (1)

Country Link
WO (1) WO2018060993A1 (fr)



Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120002848A1 (en) * 2009-04-16 2012-01-05 Hill Daniel A Method of assessing people's self-presentation and actions to evaluate personality type, behavioral tendencies, credibility, motivations and other insights through facial muscle activity and expressions
US20130018837A1 (en) * 2011-07-14 2013-01-17 Samsung Electronics Co., Ltd. Emotion recognition apparatus and method
US20160163332A1 (en) * 2014-12-04 2016-06-09 Microsoft Technology Licensing, Llc Emotion type classification for interactive dialog system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Faception App Will Instantly Tell if You're a Good Person", TECHCO, July 2016 (2016-07-01), XP055497503, Retrieved from the Internet <URL:https://tech.co/faception-facial-recognition-app-personality-2016-07> [retrieved on 20160731] *

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109583325A (zh) * 2018-11-12 2019-04-05 平安科技(深圳)有限公司 Face sample image annotation method and apparatus, computer device and storage medium
CN109583325B (zh) * 2018-11-12 2023-06-27 平安科技(深圳)有限公司 Face sample image annotation method and apparatus, computer device and storage medium
CN109829154B (zh) * 2019-01-16 2023-04-28 中南民族大学 Semantics-based personality prediction method, user equipment, storage medium and apparatus
CN109829154A (zh) * 2019-01-16 2019-05-31 中南民族大学 Semantics-based personality prediction method, user equipment, storage medium and apparatus
CN110166836B (zh) * 2019-04-12 2022-08-02 深圳壹账通智能科技有限公司 Television program switching method and apparatus, readable storage medium and terminal device
CN110166836A (zh) * 2019-04-12 2019-08-23 深圳壹账通智能科技有限公司 Television program switching method and apparatus, readable storage medium and terminal device
WO2020216064A1 (fr) * 2019-04-24 2020-10-29 京东方科技集团股份有限公司 Speech emotion recognition method, semantic recognition method, question-answering method, computer device and computer-readable storage medium
CN111985243A (zh) * 2019-05-23 2020-11-24 中移(苏州)软件技术有限公司 Emotion model training method, emotion analysis method, apparatus and storage medium
CN111985243B (zh) * 2019-05-23 2023-09-08 中移(苏州)软件技术有限公司 Emotion model training method, emotion analysis method, apparatus and storage medium
CN110390254A (zh) * 2019-05-24 2019-10-29 平安科技(深圳)有限公司 Face-based character analysis method and apparatus, computer device and storage medium
CN110390254B (zh) * 2019-05-24 2023-09-05 平安科技(深圳)有限公司 Face-based character analysis method and apparatus, computer device and storage medium
CN110796020A (zh) * 2019-09-30 2020-02-14 深圳云天励飞技术有限公司 Mood index analysis method and related apparatus
CN110796020B (zh) * 2019-09-30 2022-03-25 深圳云天励飞技术有限公司 Mood index analysis method and related apparatus
KR20210041757A (ko) 2019-10-08 2021-04-16 삼성전자주식회사 Electronic apparatus and control method thereof
KR102783867B1 (ko) 2019-10-08 2025-03-21 삼성전자주식회사 Electronic apparatus and control method thereof
US11676419B2 (en) 2019-10-08 2023-06-13 Samsung Electronics Co., Ltd. Electronic apparatus and control method thereof
WO2021071155A1 (fr) 2019-10-08 2021-04-15 Samsung Electronics Co., Ltd. Electronic apparatus and control method thereof
EP4004696A4 (fr) 2019-10-08 2022-09-07 Samsung Electronics Co., Ltd. Electronic apparatus and control method thereof
US11538278B2 (en) 2019-10-08 2022-12-27 Samsung Electronics Co., Ltd. Electronic apparatus and control method thereof
CN110889454A (zh) * 2019-11-29 2020-03-17 上海能塔智能科技有限公司 Model training method and apparatus, emotion recognition method and apparatus, device and medium
CN110991344B (zh) * 2019-12-04 2023-02-24 陕西科技大学 Deep-learning-based emotion soothing system
CN110991344A (zh) * 2019-12-04 2020-04-10 陕西科技大学 Deep-learning-based emotion soothing system
CN113361307A (zh) * 2020-03-06 2021-09-07 上海卓繁信息技术股份有限公司 Facial expression classification method, apparatus and storage device
CN114203202A (zh) * 2020-09-17 2022-03-18 北京有限元科技有限公司 Speech emotion recognition method for dialogue scenarios, apparatus and computing device
CN113591525A (zh) * 2020-10-27 2021-11-02 蓝海(福建)信息科技有限公司 Driver road-rage recognition method deeply fusing facial expressions and speech
CN113591525B (zh) * 2020-10-27 2024-03-01 蓝海(福建)信息科技有限公司 Driver road-rage recognition method deeply fusing facial expressions and speech
CN113221560B (zh) * 2021-05-31 2023-04-18 平安科技(深圳)有限公司 Personality trait and emotion prediction method and apparatus, computer device and medium
CN113221560A (zh) * 2021-05-31 2021-08-06 平安科技(深圳)有限公司 Personality trait and emotion prediction method and apparatus, computer device and medium
CN114241546A (zh) * 2021-11-18 2022-03-25 南通大学 Face recognition method based on multi-directional local binary patterns

Similar Documents

Publication Publication Date Title
WO2018060993A1 (fr) Method and system for personality-weighted emotion analysis
CN113094578B (zh) Deep-learning-based content recommendation method, apparatus, device and storage medium
US20210118424A1 (en) Predicting personality traits based on text-speech hybrid data
Dupré et al. Accuracy of three commercial automatic emotion recognition systems across different individuals and their facial expressions
Kumar et al. Personality detection using kernel-based ensemble model for leveraging social psychology in online networks
Pathirana et al. A Reinforcement Learning-Based Approach for Promoting Mental Health Using Multimodal Emotion Recognition
Alipour et al. Ai-driven marketing personalization: Deploying convolutional neural networks to decode consumer behavior
Singh et al. A Survey on: Personality Prediction from Multimedia through Machine Learning
Tiwari et al. A novel approach for detecting emotion in text
Rodriguez-Meza et al. Recurrent neural networks for deception detection in videos
Sharma et al. Depression detection using multimodal analysis with chatbot support
CN117668758A (zh) Dialogue intent recognition method and apparatus, electronic device and storage medium
Singh et al. Facial Emotion Detection Using CNN-Based Neural Network
Dodd et al. Facial emotion recognition and the future of work
Nimitha et al. Deep learning based sentiment analysis on images
Thakkar et al. Automatic assessment of communication skill in real-world job interviews: A comparative study using deep learning and domain adaptation.
Wahid et al. Emotion detection using unobtrusive methods: An integrated approach
Goulart et al. Mapping Personality Traits through Keystroke Analysis.
Kanchana et al. Analysis of social media images to predict user personality assessment
Marry et al. Interview Brilliance: Harnessing AI for Confidence-Driven Evaluation
Rohilla et al. Comparative Analysis of Machine Learning and Deep Learning Techniques for Social Media-Based Stress Detection
US20240177523A1 (en) System and Methods of Predicting Personality and Morals through Facial Emotion Recognition
Rathor et al. The art of context classification and recognition of text conversation using CNN
Anbazhagan et al. Twitter Based Emotion Recognition Using Bi-LSTM
Krishna et al. Analyzing Personality Insights Through Machine Learning

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17855159

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17855159

Country of ref document: EP

Kind code of ref document: A1
