WO2020036345A1 - Device for manufacturing cosmetic products - Google Patents
Device for manufacturing cosmetic products
- Publication number
- WO2020036345A1 (PCT/KR2019/009540)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- type
- cosmetic
- manufacturing device
- processor
- image
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/04—Manufacturing
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Definitions
- the present invention relates to a portable or stationary personalized cosmetic manufacturing apparatus capable of microfluidic control based on real-time sensor data.
- Another existing technique is the customized cosmetic makers 21 and 23 for shops.
- Customized cosmetic makers 21 and 23 for shops may measure a skin condition using a questionnaire or a skin measurement device in a shop, and provide a finished cosmetic using a dispenser-type cosmetic maker as shown in FIG. 2.
- however, this is also a large dispenser-type device used in shops, and it does not provide cosmetics that reflect the user's changing skin condition and surrounding environment in real time.
- An object of the present invention is to provide a personalized cosmetic manufacturing apparatus reflecting the user's skin condition and the surrounding environment in real time.
- An object of the present invention is to provide a cosmetic manufacturing apparatus capable of personalized precision fluid control based on real-time sensing data.
- the cosmetic manufacturing apparatus includes a sensing unit that obtains skin condition data and surrounding environment data for each of a plurality of region areas constituting a face of a user, a driving unit including one or more motors, and a processor configured to determine a manufacturing type of the cosmetic using the skin condition data and the surrounding environment data, and to control the driving unit to manufacture and discharge the cosmetic according to the determined manufacturing type.
- a personalized cosmetic optimized for an individual may be provided in real time.
- personalized cosmetics can be used at any time, even on the go or in an office environment.
- FIGS. 1 and 2 are diagrams illustrating examples of cosmetic makers according to the prior art.
- FIG. 3 is a block diagram illustrating a configuration of a cosmetic manufacturing apparatus according to an embodiment of the present invention.
- FIG. 4 is a flowchart illustrating a method of operating a cosmetic manufacturing apparatus for manufacturing cosmetics based on sensing data according to an embodiment of the present invention.
- FIG. 5 is a diagram illustrating a process of extracting a plurality of partial images from a facial image photographed by a camera according to an exemplary embodiment.
- FIG. 6 is a diagram describing an image conversion model that converts partial images of a first type into second and third types, according to an exemplary embodiment.
- FIG. 7 is a view illustrating a skin care model according to an embodiment of the present invention.
- FIG. 8 is a front view of the cosmetic manufacturing apparatus 100-1 of the stationary type
- FIG. 9 is a perspective view of the cosmetic manufacturing apparatus 100-1 of the stationary type.
- FIGS. 10 to 12 are views for explaining a portable-type cosmetic manufacturing apparatus according to an embodiment of the present invention.
- FIG. 13 is a view for explaining the principle of discharging cosmetic products according to the determined cosmetic production type.
- FIGS. 14A to 14D are diagrams for explaining an example of providing information on facial skin care of a user through a cosmetic manufacturing apparatus according to an embodiment of the present invention.
- FIG. 3 is a block diagram illustrating a configuration of a cosmetic manufacturing apparatus according to an embodiment of the present invention.
- the cosmetic manufacturing apparatus 100 may include a communication unit 110, an input unit 120, a driving unit 130, a sensing unit 140, an output unit 150, a cosmetic accommodating unit 160, a memory 170, a processor 180, and a discharge unit 190.
- the communication unit 110 may transmit and receive data with an external device using a wired or wireless communication technology.
- the communication unit 110 may transmit / receive sensor information, a user input, a learning model, a control signal, and the like with an external device.
- the communication technology used by the communication unit 110 may include Global System for Mobile Communication (GSM), Code Division Multiple Access (CDMA), Long Term Evolution (LTE), 5G, Wireless LAN (WLAN), Wireless Fidelity (Wi-Fi), Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), ZigBee, Near Field Communication (NFC), and the like.
- the input unit 120 may acquire various types of data.
- the input unit 120 may include a camera for inputting an image signal, a microphone for receiving an audio signal, a user input unit for receiving information from a user, and the like.
- a signal obtained from the camera or microphone may be referred to as sensing data or sensor information by treating the camera or microphone as a sensor.
- the input unit 120 may include a camera 121 for inputting an image signal, a microphone 122 for receiving an audio signal, and a user input unit 123 for receiving information from a user.
- the voice data or the image data collected by the input unit 120 may be analyzed and processed as a user's control command.
- the input unit 120 is for inputting image information (or signal), audio information (or signal), data, or information input from a user.
- the cosmetic manufacturing apparatus 100 may include one or more cameras 121.
- the camera 121 processes image frames such as still images or moving images obtained by the image sensor in the video call mode or the photographing mode.
- the processed image frame may be displayed on the display unit 151 or stored in the memory 170.
- the microphone 122 processes external sound signals into electrical voice data.
- the processed voice data may be variously used according to a function (or an application program being executed) performed by the cosmetic manufacturing apparatus 100. Meanwhile, various noise removal algorithms may be applied to the microphone 122 to remove noise generated in the process of receiving an external sound signal.
- the user input unit 123 is for receiving information from a user.
- the processor 180 may control an operation of the cosmetic manufacturing apparatus 100 to correspond to the input information.
- the user input unit 123 may include mechanical input means (or mechanical keys, for example, a button, a dome switch, a jog wheel, or a jog switch located on the front, rear, or side surfaces of the apparatus 100) and touch input means.
- the touch input means may include a virtual key, a soft key, or a visual key displayed on the touch screen through software processing, or a touch key disposed on a portion other than the touch screen.
- the driver 130 may move the component elements stored in the cosmetic accommodating part 160 downward according to the determined manufacturing type.
- the moved component elements may be discharged to the outside through the discharge unit 190.
- the sensing unit 140 may acquire at least one of internal information of the cosmetic manufacturing apparatus 100, surrounding environment information of the cosmetic manufacturing apparatus 100, and user information using various sensors.
- the sensors included in the sensing unit 140 may include a proximity sensor, an illumination sensor, an acceleration sensor, a magnetic sensor, a gyro sensor, an inertial sensor, an RGB sensor, an IR sensor, a fingerprint sensor, an ultrasonic sensor, an optical sensor, a microphone, a lidar, a radar, and so on.
- the sensing unit 140 may include a moisture sensor for measuring the moisture content of the skin and an oil sensor for measuring the oil content of the skin.
- the sensing unit 140 may further include sensors for measuring a state of the surrounding environment, such as a temperature sensor, a humidity sensor, an ultraviolet sensor, and a dust sensor.
- the output unit 150 may generate an output related to visual, auditory, or tactile senses.
- the output unit 150 may include a display unit for outputting visual information, a speaker for outputting auditory information, and a haptic module for outputting tactile information.
- the output unit 150 may include at least one of a display unit 151, a sound output unit 152, a haptic module 153, and an optical output unit 154.
- the display unit 151 displays (outputs) information processed by the cosmetic manufacturing apparatus 100.
- the display unit 151 may display execution screen information of an application program driven by the cosmetic manufacturing apparatus 100, or UI (User Interface) or Graphic User Interface (GUI) information according to the execution screen information.
- the display unit 151 forms a layer structure with or is integrally formed with the touch sensor, thereby implementing a touch screen.
- a touch screen may function as a user input unit 123 that provides an input interface between the cosmetic manufacturing apparatus 100 and the user, and may provide an output interface between the cosmetic manufacturing apparatus 100 and the user.
- the sound output unit 152 may output audio data received from the communication unit 110 or stored in the memory 170 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like.
- the sound output unit 152 may include at least one of a receiver, a speaker, and a buzzer.
- the haptic module 153 generates various haptic effects that a user can feel.
- a representative example of the tactile effect generated by the haptic module 153 may be vibration.
- the light output unit 154 outputs a signal for notifying occurrence of an event by using light of a light source of the cosmetic manufacturing apparatus 100.
- the cosmetic container 160 may receive a plurality of component elements.
- the cosmetic accommodating part 160 may include a plurality of cartridges. Each of the plurality of cartridges may receive each of the plurality of component elements.
- the memory 170 may store data supporting various functions of the cosmetic manufacturing apparatus 100.
- the memory 170 may store input data acquired by the input unit 120, sensing data measured by the sensing unit 140, and the like.
- the processor 180 may control the components of the cosmetic manufacturing apparatus 100.
- the processor 180 may request, retrieve, receive, or utilize data in the memory 170, and may control the components of the cosmetic manufacturing apparatus 100 to execute a predicted or desirable operation among at least one executable operation.
- the processor 180 may generate a control signal for controlling the corresponding external device and transmit the generated control signal to the corresponding external device.
- the processor 180 may obtain intention information about the user input, and determine the user's requirements based on the obtained intention information.
- the processor 180 may obtain intention information corresponding to the user input by using at least one of a speech-to-text (STT) engine for converting a voice input into a character string and a natural language processing (NLP) engine for obtaining intention information from natural language.
- the processor 180 may collect history information including the operation contents of the cosmetic manufacturing apparatus 100 or the user's feedback about the operation, etc., and store the information in the memory 170 or transmit the information to an external device.
- the processor 180 may control at least some of the components of the cosmetic manufacturing apparatus 100 to drive an application program stored in the memory 170. In addition, the processor 180 may operate two or more of the components included in the cosmetic manufacturing apparatus 100 in combination with each other to drive the application program.
- FIG. 4 is a flowchart illustrating a method of operating a cosmetic manufacturing apparatus for manufacturing cosmetics based on sensing data according to an embodiment of the present invention.
- the processor 180 obtains skin condition data and surrounding environment data (S401).
- the processor 180 may obtain skin condition data and surrounding environment data through the sensing unit 140 or the camera 121.
- the skin condition data may include state information of the face of the user.
- the facial state information may include one or more of moisture content, pore state, wrinkle state, trouble state, and color of each of the plurality of region areas constituting the face.
- the plurality of region areas may include a forehead area, a left cheek area, a right cheek area, a nose area, and a jaw area.
- the moisture sensor provided in the sensing unit 140 may measure the amount of moisture of each of the plurality of region areas constituting the face.
- the moisture sensor can include two or more electrodes and can measure the capacitance and impedance of each site region.
- the moisture sensor may measure the moisture content of the skin by using the principle that skin containing more moisture conducts electricity better.
- the moisture sensor may pair two electrodes and measure the capacitance and impedance between the electrodes.
- the measured capacitances and impedances may correspond to specific amounts of moisture.
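The capacitance-to-moisture correspondence described above can be sketched as follows. This is a hypothetical Python illustration: the `moisture_percent` function, the calibration constants, and the linear interpolation are assumptions for clarity, not values from the patent.

```python
# Hypothetical sketch: mapping a capacitance reading between two electrodes
# to a relative moisture level. Calibration constants are illustrative only.

def moisture_percent(capacitance_pf, dry_pf=20.0, wet_pf=120.0):
    """Linearly interpolate a capacitance reading (pF) to a 0-100 moisture scale.

    Skin with more moisture conducts better and shows higher capacitance,
    so readings are clamped to the calibrated dry/wet range.
    """
    frac = (capacitance_pf - dry_pf) / (wet_pf - dry_pf)
    return round(max(0.0, min(1.0, frac)) * 100.0, 1)

# One reading per region area of the face (values are made up).
readings = {"forehead": 70.0, "nose": 45.0, "left_cheek": 95.0}
moisture = {region: moisture_percent(c) for region, c in readings.items()}
```

In practice each (capacitance, impedance) pair would be mapped through a sensor-specific calibration table rather than a single linear ramp.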
- the processor 180 may collect face state information through the camera 121.
- the camera 121 may photograph the face of the user, and the processor 180 may analyze the state of each of the plurality of region areas constituting the face by analyzing the photographed face image.
- FIG. 5 is a diagram illustrating a process of extracting a plurality of partial images from a facial image photographed by a camera according to an exemplary embodiment.
- the first type of facial image 500 captured by the camera 121 is illustrated.
- the first type of face image 500 may be an RGB image.
- the processor 180 may classify the first type of face image 500 into a plurality of first types of partial images 510 to 550 using a block extraction technique.
- the block extraction technique may be a technique of classifying a face image into a plurality of blocks, extracting feature points from each of the classified blocks, and extracting main region images from a face image.
- the processor 180 may classify the facial image into main area regions based on the feature points, and extract the main area image through the coordinates of each classified main area region.
- the classified plurality of first types of partial images may include a forehead image 510, a nose image 520, a jaw image 530, a left cheek image 540, and a right cheek image 550.
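The region-extraction step above can be illustrated with a minimal sketch. The fixed fractional bounding boxes below stand in for the patent's feature-point-based block extraction; the `REGION_BOXES` coordinates and helper names are hypothetical.

```python
# Minimal sketch of the block-extraction idea: crop five region images
# (forehead, nose, jaw, left/right cheek) from an RGB face image.
# Region boxes are hypothetical fractions of the image size, not the
# patent's actual feature-point algorithm.

def crop(image, box):
    """Crop a 2D list 'image' (rows of pixels) to box=(top, bottom, left, right) fractions."""
    h, w = len(image), len(image[0])
    t, b, l, r = box
    return [row[int(l * w):int(r * w)] for row in image[int(t * h):int(b * h)]]

REGION_BOXES = {            # (top, bottom, left, right) as fractions
    "forehead":    (0.00, 0.25, 0.20, 0.80),
    "nose":        (0.35, 0.65, 0.40, 0.60),
    "jaw":         (0.80, 1.00, 0.30, 0.70),
    "left_cheek":  (0.40, 0.70, 0.05, 0.30),
    "right_cheek": (0.40, 0.70, 0.70, 0.95),
}

def extract_regions(face_image):
    return {name: crop(face_image, box) for name, box in REGION_BOXES.items()}

face = [[(r, c) for c in range(100)] for r in range(100)]  # dummy 100x100 image
parts = extract_regions(face)
```

A real implementation would derive the boxes from detected facial feature points rather than fixed fractions, but the per-region cropping step is the same.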
- FIG. 6 is a diagram describing an image conversion model that converts partial images of a first type into second and third types, according to an exemplary embodiment.
- the image transformation model may be a trained artificial neural network-based model using a deep learning algorithm or a machine learning algorithm.
- the image transformation model may use a known deep learning-based Generative Adversarial Network (GAN) algorithm.
- a generative adversarial network is an artificial neural network that learns through competition between two neural network models, a generator and a discriminator, and outputs a result.
- the generator learns from real data and generates fake data based on it, while the discriminator learns to determine whether the generator's output is real or fake.
- the generator then learns from the data that failed to deceive the discriminator, and the discriminator learns from the data that deceived it. By repeating this process, increasingly realistic fake data may be generated.
- the image conversion model may be a model for converting an RGB image into an IR image or an RGB image into a UV image through a mapping function.
- the mapping function may be a function used to map the RGB image and the IR image on the topology, or to map the RGB image and the UV image on the topology.
- the image conversion model may generate a near-real IR image or a UV image corresponding to the RGB image in this manner.
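The adversarial training idea behind such a conversion model can be sketched in miniature. The toy below replaces deep networks with a single scalar parameter so the generator/discriminator alternation is visible; everything in it (the parameter, the scoring function, the update rule) is an illustrative assumption, not the patent's method.

```python
# Toy sketch of the adversarial loop behind a GAN-style RGB -> IR/UV
# conversion model. Real implementations use deep networks and
# backpropagation; here a scalar parameter keeps the alternation visible.

import random

random.seed(0)

real_mean = 5.0        # stand-in for the statistics of real IR images
gen_param = 0.0        # the generator's single learnable parameter

def generator(noise):
    # "Converts" noise into a fake sample using the current parameter.
    return gen_param + noise

def discriminator(x):
    # Score in (0, 1]: higher when x looks more like the real data.
    return 1.0 / (1.0 + abs(x - real_mean))

for _ in range(200):
    noise = random.uniform(-0.1, 0.1)
    fake = generator(noise)
    # The generator probes which direction the discriminator scores higher
    # and nudges its parameter that way -- a gradient-free stand-in for
    # backpropagating the adversarial loss.
    if discriminator(fake + 0.05) > discriminator(fake - 0.05):
        gen_param += 0.05
    else:
        gen_param -= 0.05
```

After training, the generator's output distribution has moved close to the real data, which is the property the image conversion model exploits to produce near-real IR and UV images.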
- a first type partial image set 610 including a plurality of first types of partial images 611 to 615 may be input to the image conversion model 630.
- the image conversion model 630 may convert the first type partial image set 610 into the second type partial image set 650 and the third type partial image set 670 using the generative adversarial network.
- the image transformation model 630 may be trained to determine optimal constants of the mapping function used to convert the first type of image into a second type of image.
- the image transformation model 630 may be trained to determine optimal constants of the mapping function used to convert the first type of image into a third type of image.
- the second type of partial image set 650 includes IR partial images corresponding to each of the plurality of first types of partial images 611 to 615, and the third type of partial image set 670 includes UV partial images corresponding to each of the first types of partial images 611 to 615.
- IR type and UV type images can be used to predict the skin condition of each site.
- the processor 180 may predict a skin condition of each region area by using a skin care model.
- the skin care model may be a model that determines values of skin condition variables indicating the condition of the skin, based on the second and third types of partial images as input data.
- Variables that indicate the condition of the skin may include oil, blemishes, pores, wrinkles, damage, pigmentation, elasticity, and moisture.
- the skin care model may be a model based on artificial neural networks learned through a deep learning algorithm or a machine learning algorithm.
- the processor 180 may obtain the values of the skin condition variables as output data by using the second type of partial image and the third type of partial image as input data of the skin care model.
- the skin care model can infer optimal values of the skin condition variables through supervised learning.
- the skin care model will be described with reference to FIG. 7.
- FIG. 7 is a view illustrating a skin care model according to an embodiment of the present invention.
- the training data for learning the skin care model 700 may include values of the second and third types of partial image data (partial image data set) and skin condition variables corresponding to a part constituting the face.
- the labeling data labeled in the partial image data set may be values of skin condition variables.
- a skin condition result indicating a skin condition may be output as a target feature vector.
- image data in pixel units may be input to the skin care model 700.
- the skin care model 700 can be trained to minimize the value of a cost function corresponding to the difference between the output target feature vector and the labeled skin condition.
- the values of the parameters making up the cost function can be optimized.
- the target feature vector of the skin care model 700 may be configured as an output layer including a plurality of nodes representing skin conditions.
- the artificial neural network of the skin care model 700 may be composed of a pixel-wise attribute-aware correlation map or a convolutional neural network (CNN).
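The supervised training loop described above (adjusting parameters to minimize a cost function against labeled skin-condition values) can be sketched with a linear model standing in for the CNN. The features, labels, and learning rate below are synthetic assumptions for illustration.

```python
# Sketch of cost-minimizing supervised training: a linear model stands in
# for the skin care model's CNN, and squared error is the cost function.

def predict(weights, features):
    # Linear stand-in for the network's forward pass.
    return sum(w * f for w, f in zip(weights, features))

def train(samples, lr=0.01, epochs=500):
    """samples: list of (features, label) pairs; returns learned weights."""
    weights = [0.0] * len(samples[0][0])
    for _ in range(epochs):
        for features, label in samples:
            # Squared-error cost; 'error' is its derivative w.r.t. the prediction.
            error = predict(weights, features) - label
            for i, f in enumerate(features):
                weights[i] -= lr * error * f      # gradient descent step
    return weights

# Synthetic training data generated from true weights [0.5, 0.2],
# e.g. (image-derived features) -> labeled moisture value.
data = [([1.0, 2.0], 0.9), ([2.0, 1.0], 1.2), ([0.0, 1.0], 0.2)]
w = train(data)
```

The skin care model does the same thing at scale: the labeled skin-condition values play the role of the labels here, and the cost function's parameters are optimized until the output vector matches them.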
- the IR type partial image 710 and the UV type partial image 730 corresponding to the forehead may be input to the previously trained skin care model 700.
- the skin care model 700 may output a set 750 of skin state variables indicative of skin condition as the IR type partial image 710 and UV type partial image 730 are input.
- the set of values of the skin condition variables 750 may include the degree of oiliness, the degree of wrinkles, the size of pores, the degree of blemishes, the number of scar spots, the degree of pigmentation, the elasticity, and the degree of moisture.
- the skin care model 700 may be trained by an external server.
- the cosmetic manufacturing apparatus 100 may receive the skin care model 700 from an external server, and store the received skin care model 700 in the memory 170.
- the cosmetic manufacturing apparatus 100 may transmit an image captured by the camera 121 to an external server.
- the external server may store the previously trained image conversion model and skin care model, and use the stored models to analyze the received image and infer the skin condition according to the analysis result.
- the processor 180 may measure the color of each part constituting the face through a color sensor (not shown) provided in the sensing unit 140.
- ambient environment data may include temperature, humidity, ultraviolet index, and fine dust concentration.
- the sensing unit 140 may include a temperature sensor, a humidity sensor, an ultraviolet sensor, and a fine dust sensor for collecting ambient environment data.
- the processor 180 may receive ambient environment data from the external server through the communication unit 110.
- the processor 180 determines a manufacturing type of the cosmetic using the obtained skin condition data and the surrounding environment data (S403).
- the type of manufacture of the cosmetic may include the blending ratio of the component elements of the cosmetic and the amount of each component element.
- the processor 180 may determine the manufacturing type of the cosmetic product as the first manufacturing type.
- the processor 180 may determine the manufacturing type of the cosmetic product as the second manufacturing type.
- the blending ratio of the component elements of the first manufacturing type and the blending ratio of the component elements of the second manufacturing type may be different.
- the amount of each component element of the first manufacturing type and the amount of each component element of the second manufacturing type may be different from each other.
- the processor 180 may assign a different manufacturing type to each of the plurality of region areas constituting the face.
- the processor 180 may determine the manufacturing type of the cosmetic for the forehead area as the first manufacturing type based on the state data of the forehead area and the surrounding environment data.
- the processor 180 may determine the manufacturing type of the cosmetic for the nose area as the second manufacturing type based on the state data of the nose area and the surrounding environment data.
- the processor 180 may determine the manufacturing type of the cosmetic for the left cheek area as the third manufacturing type based on the state data and the surrounding environment data of the left cheek area.
- the processor 180 may determine the manufacturing type of the cosmetic for the right cheek area as the fourth manufacturing type based on the state data of the right cheek area and the surrounding environment data.
- the processor 180 may determine the manufacturing type of the cosmetic for the jaw area as the fifth manufacturing type based on the state data and the surrounding environment data of the jaw area.
- the processor 180 may determine a blending ratio and a blending amount of the humectant and the oil based on the amount of moisture measured in each of the plurality of region areas constituting the face.
- the processor 180 may determine the blending ratio and the blending amount of the functional syrup based on the state of each of the plurality of region areas constituting the face.
- the processor 180 may determine the blending amount of the sunscreen based on the surrounding environment data.
- the processor 180 may determine the blending amount of the foundation based on the skin tone measured by the color sensor.
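A rule-based sketch of step S403 might look like the following; the thresholds, component names, and amounts are illustrative assumptions, not values disclosed in the patent.

```python
# Hypothetical sketch of step S403: turning per-region skin data plus
# environment data into a manufacturing type (amounts and blending ratio).

def manufacturing_type(region_state, environment):
    moisture = region_state["moisture"]    # 0-100 scale
    oiliness = region_state["oiliness"]    # 0-100 scale
    uv_index = environment["uv_index"]

    amounts = {
        "humectant": 3.0 if moisture < 40 else 1.0,   # ml; more for dry skin
        "oil":       0.5 if oiliness > 60 else 1.5,   # less oil for oily skin
        "sunscreen": 2.0 if uv_index >= 6 else 0.5,
    }
    total = sum(amounts.values())
    return {
        "amounts_ml": amounts,
        "ratio": {k: round(v / total, 2) for k, v in amounts.items()},
    }

# Dry, oily forehead on a high-UV day.
forehead = manufacturing_type({"moisture": 35, "oiliness": 70},
                              {"uv_index": 8})
```

Each region area would get its own call with its own state data, yielding the per-region manufacturing types (first through fifth) described above.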
- the processor 180 manufactures the cosmetic according to the determined manufacturing type (S405) and discharges the manufactured cosmetic (S407).
- the processor 180 may control the plurality of motors connected to the plurality of cartridges to manufacture the cosmetic according to the determined blending ratio of the component elements of the cosmetic product and the amount of each component element.
- Each of the plurality of cartridges may correspond to each of the plurality of component elements.
- Each of the plurality of cartridges may hold each of the plurality of component elements.
- the processor 180 may adjust the pressure applied to each of the plurality of motors according to the determined manufacturing type.
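Steps S405 and S407 (driving one motor per cartridge until each component element's target amount has been discharged) can be sketched as below. The `Cartridge` class, the flow rate, and the run-time calculation are hypothetical illustrations.

```python
# Minimal sketch of per-cartridge motor control: each cartridge's motor
# runs just long enough to dispense its target amount.

class Cartridge:
    def __init__(self, element, flow_ml_per_s):
        self.element = element
        self.flow_ml_per_s = flow_ml_per_s
        self.dispensed_ml = 0.0

    def run_motor(self, seconds):
        # Volume dispensed is assumed proportional to motor run time.
        self.dispensed_ml += self.flow_ml_per_s * seconds

def dispense(cartridges, amounts_ml):
    """Run each cartridge's motor for target_amount / flow_rate seconds."""
    for element, target in amounts_ml.items():
        cart = cartridges[element]
        cart.run_motor(target / cart.flow_ml_per_s)

cartridges = {name: Cartridge(name, 0.5) for name in ("humectant", "oil", "syrup")}
dispense(cartridges, {"humectant": 2.0, "oil": 1.0, "syrup": 0.5})
```

In the actual apparatus the processor would additionally modulate the pressure applied by each motor, as noted above, rather than only its run time.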
- customized cosmetics may be provided to the user in real time.
- the skin care of the user can be made more efficient.
- the processor 180 may sequentially discharge cosmetics to be applied to each of the plurality of region areas of the face. For example, the cosmetic manufacturing apparatus 100 may first discharge a first cosmetic to be applied to the forehead area, and then discharge a second cosmetic to be applied to the nose area.
- an optimized cosmetic product for each part of the user's face may be provided, and the satisfaction of the user's face skin care may be greatly improved.
- FIG. 8 and FIG. 9 are diagrams for describing a cosmetic apparatus of a stationary type according to an embodiment of the present invention.
- FIG. 8 is a front view of the cosmetic manufacturing apparatus 100-1 of the stationary type
- FIG. 9 is a perspective view of the cosmetic manufacturing apparatus 100-1 of the stationary type.
- the cosmetics manufacturing apparatus 100-1 of the stationary type may include all of the components described in FIG. 3.
- the stationary cosmetic manufacturing apparatus 100-1 may include a main body case 103, a camera 121 for photographing a user's face, a mirror 101, a display unit 151, a moisture sensor 141, a plurality of cartridges 161 to 165 for accommodating cosmetic components, and a discharge unit 190 including a plurality of nozzles.
- the moisture sensor 141 is a sensor for measuring the amount of moisture of the face and may have a pen type shape. However, the present disclosure is not limited thereto, and the moisture sensor 141 may instead have a mask type or comb type shape.
- the moisture sensor 141 may be detachable from the stationary cosmetic manufacturing apparatus 100-1.
- when the moisture sensor 141 is detached from the stationary cosmetic manufacturing apparatus 100-1, the measured moisture content of each of the plurality of partial regions constituting the face may be transmitted to the stationary cosmetic manufacturing apparatus 100-1.
- each of the moisture sensor 141 and the stationary cosmetic manufacturing apparatus 100-1 may include a short range wireless communication module.
- the short range wireless communication standard used may be a Bluetooth standard, but this is only an example.
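The hand-off of measurements from the detached sensor to the main unit can be sketched as a small encode/decode pair. The byte layout below is invented for illustration; an actual device would follow its own characteristic format over whatever short range wireless standard it uses.

```python
# Hedged sketch of the measurement hand-off: the detachable moisture sensor
# packs per-region readings into a compact payload, and the stationary unit
# unpacks it on receipt. The JSON-over-bytes format is an assumption.

import json

REGIONS = ["forehead", "nose", "cheeks", "chin"]  # assumed region names

def encode_readings(readings: dict) -> bytes:
    # Keep the payload small for a short-range wireless link.
    compact = {r: round(readings[r], 1) for r in REGIONS if r in readings}
    return json.dumps(compact, separators=(",", ":")).encode("utf-8")

def decode_readings(payload: bytes) -> dict:
    return json.loads(payload.decode("utf-8"))

payload = encode_readings({"forehead": 31.4, "nose": 54.0})
restored = decode_readings(payload)
```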
- the facial image captured by the camera 121 may be used to measure the skin condition of the face. The description given with reference to FIG. 4 applies to this process.
- the display unit 151 may display one or more of the measured facial state information and the determined manufacturing type of the cosmetic.
- the cosmetic accommodating part 160 may include a plurality of cartridges 161 to 165. Each of the plurality of cartridges 161 to 165 may receive different cosmetic component elements.
- the cosmetic component element may be a solution in the form of a syrup.
- Each of the plurality of cartridges 161 to 165 may be disposed inside the body case 103.
- the plurality of cartridges 161 to 165 may be separated from the body case 103; that is, they are removable.
- the plurality of cartridges 161 to 165 may be provided in a capsule form.
- the user can load the desired cosmetic components into the body case 103 by replacing cartridges as necessary.
- the discharge unit 190 may include a plurality of nozzles corresponding to each of the plurality of cartridges 161 to 165.
- the plurality of component elements pertaining to the determined manufacturing type may be ejected through the plurality of nozzles.
- each of the plurality of nozzles constituting the discharge unit 190 may be attached to each of the plurality of cartridges.
- FIGS. 10 to 12 are views for explaining a portable type cosmetic manufacturing apparatus according to an embodiment of the present invention.
- FIG. 10 is a perspective view of the portable type cosmetic manufacturing apparatus 100-2
- FIG. 11 is a bottom view of the portable type cosmetic manufacturing apparatus 100-2
- FIG. 12 is an exploded perspective view of the portable type cosmetic manufacturing apparatus 100-2.
- the portable type cosmetic manufacturing apparatus 100-2 may include all the components shown in FIG. 3.
- the portable type cosmetic manufacturing apparatus 100-2 may include an upper case 1010 and a lower case 1030.
- the upper case 1010 and the lower case 1030 may be fastened.
- the display unit 151 may be mounted on one surface of the upper case 1010.
- the display unit 151 may display one or more of the measured facial state information and the determined manufacturing type of the cosmetic.
- the lower case 1030 may accommodate the plurality of cartridges 1201 to 1207.
- the lower case 1030 may include a plurality of grooves 1211 to 1217 to accommodate the plurality of cartridges 1201 to 1207.
- FIG. 13 is a view for explaining the principle of discharging cosmetic products according to the determined cosmetic production type.
- One sub driver 130-1 may correspond to one cartridge 161.
- the cartridge 161 may include a component element case 161a and a syrup solution 161b accommodated therein as the component element.
- the driving unit 130 of the cosmetic manufacturing apparatus 100 may include an actuator 131, a connecting rod 133, and a plunger 135.
- the actuator 131 may apply pressure to the plunger 135 via the connecting rod 133.
- the actuator 131 may serve as a dispenser that quantitatively controls a cosmetic component and discharges a portion of the syrup solution.
- the driving unit 130 may include two sub-driving units: one sub driver may perform positioning through a linear motion, and the other sub driver may perform dispensing.
- the actuator 131 may apply, to the plunger 135, the pressure specified in the driving signal received from the processor 180.
- the connecting rod 133 may connect between the actuator 131 and the plunger 135.
- the connecting rod 133 may connect between the central axis of the actuator 131 and the central axis of the plunger 135.
- the plunger 135 may apply a predetermined pressure to the syrup solution 161b through the actuator 131 and the connecting rod 133.
- syrup solution 1310 may be discharged through the nozzle 191.
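The actuator–connecting-rod–plunger mechanism behaves like a positive-displacement pump: the volume ejected equals the plunger cross-section times its travel. The sketch below computes the travel needed for a target dose; the cylindrical-bore assumption and all dimensions are illustrative, not taken from the patent.

```python
# Illustrative dose calculation for a plunger-driven cartridge: how far the
# plunger must travel to eject a target volume through the nozzle. Assumes a
# cylindrical cartridge bore (an assumption for this sketch).

import math

def plunger_travel_mm(target_ml: float, bore_diameter_mm: float) -> float:
    """Plunger travel distance for a cylindrical cartridge of the given bore."""
    area_mm2 = math.pi * (bore_diameter_mm / 2) ** 2
    target_mm3 = target_ml * 1000.0      # 1 mL = 1000 mm^3
    return target_mm3 / area_mm2

travel = plunger_travel_mm(target_ml=0.5, bore_diameter_mm=20.0)
```

Controlling dose by displacement rather than by time-under-pressure is one way a dispenser can stay quantitative even as the syrup solution's viscosity varies.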
- 14A to 14D are diagrams for explaining an example of providing information on facial skin care of a user through a cosmetic manufacturing apparatus according to an embodiment of the present invention.
- 14A to 14D may be screens that may be displayed according to execution of a skin care application installed in the cosmetic manufacturing apparatus 100.
- the cosmetic manufacturing apparatus 100 may display a photographing guide screen 1410 on the display unit 151 for measuring the skin condition of the face through the camera 121.
- the cosmetic manufacturing apparatus 100 may display the skin care menu screen 1420 on the display unit 151.
- the skin care menu screen 1420 may include a skin care item 1421 for each region, a tutorial item 1423, and a past history item 1425.
- the skin care item 1421 for each partial region may be an item for providing a manufacturing type of the cosmetic to be discharged based on the skin condition of each of the plurality of partial regions constituting the face, obtained through the captured face image.
- the tutorial item 1423 may be an item for teaching a method of using a skin care application.
- the past history item 1425 may be an item for providing a facial skin condition measured in the past.
- the cosmetic manufacturing apparatus 100 may display a manufacturing type screen 1430 that provides a manufacturing type of the cosmetic based on the measured skin condition of each part of the face, as shown in FIG. 14C.
- the manufacturing type screen 1430 may include a blending ratio of the component elements for each part region of the face.
- the cosmetic manufacturing apparatus 100 may display a guide video that guides how to use the skin care application.
- the present invention described above can be embodied as computer readable codes on a medium on which a program is recorded.
- the computer readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of computer-readable media include hard disk drives (HDDs), solid state disks (SSDs), silicon disk drives (SDDs), ROMs, RAMs, CD-ROMs, magnetic tapes, floppy disks, and optical data storage devices.
- the computer may also include a processor 180 of an artificial intelligence device.
Abstract
According to an embodiment of the invention, a cosmetic product manufacturing device may comprise: a driving unit comprising one or more motors; and a processor for acquiring skin condition data for each of a plurality of partial regions constituting a user's face, together with surrounding environment data, using the skin condition data and the surrounding environment data to determine a manufacturing type of a cosmetic, and causing the driving unit to manufacture the cosmetic according to the determined manufacturing type and dispense it.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2018-0095134 | 2018-08-14 | ||
KR20180095134 | 2018-08-14 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020036345A1 true WO2020036345A1 (fr) | 2020-02-20 |
Family
ID=69525588
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2019/009540 WO2020036345A1 (fr) | 2018-08-14 | 2019-07-31 | Cosmetic product manufacturing device
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2020036345A1 (fr) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20160061978A (ko) * | 2013-07-22 | 2016-06-01 | The Rockefeller University | Optical detection of skin disease |
JP6243008B2 (ja) * | 2013-04-09 | 2017-12-06 | ELC Management LLC | Skin diagnosis and image processing method |
KR20180018486A (ko) * | 2015-06-15 | 2018-02-21 | Haim Amir | Systems and methods for adaptive skin treatment |
KR20180020609A (ko) * | 2016-08-19 | 2018-02-28 | Ha Sung-ah | Home cosmetics maker using a portable multimedia device |
WO2018101572A1 (fr) * | 2016-12-01 | 2018-06-07 | LG Household & Health Care Ltd. | Customized cosmetic product supply system and operating method thereof |
- 2019-07-31: WO PCT/KR2019/009540 filed as WO2020036345A1 (active, Application Filing)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- WO2020141924A1 (fr) | Apparatus and method for generating cleaning-space map data | |
- WO2019164374A1 (fr) | Electronic device and method for managing a customized object based on an avatar | |
- WO2018117704A1 (fr) | Electronic apparatus and operating method thereof | |
- WO2018143630A1 (fr) | Device and method for recommending products | |
- WO2020241930A1 (fr) | Location estimation method using multiple sensors, and robot implementing same | |
- WO2018139865A1 (fr) | Mobile robot | |
- WO2019151735A1 (fr) | Visual inspection management method and visual inspection system | |
- WO2013085193A1 (fr) | Apparatus and method for enhancing user recognition | |
- WO2020130747A1 (fr) | Image processing apparatus and method for style transformation | |
- WO2019059505A1 (fr) | Method and apparatus for recognizing an object | |
- WO2019124963A1 (fr) | Device and method for recognizing speech | |
- WO2019031825A1 (fr) | Electronic device and operating method thereof | |
- WO2019125029A1 (fr) | Electronic device for displaying an object in augmented reality, and operating method thereof | |
- WO2020262746A1 (fr) | Artificial-intelligence-based apparatus for recommending a laundry course, and control method thereof | |
- WO2019108028A1 (fr) | Portable skin condition measuring device, and skin condition diagnosis and management system | |
- WO2015199288A1 (fr) | Glasses-type terminal and method for controlling same | |
- WO2023018285A1 (fr) | Artificial-intelligence virtual makeup method and device using multi-angle image recognition | |
- WO2020184736A1 (fr) | Artificial intelligence cleaning apparatus and operating method thereof | |
- WO2021006482A1 (fr) | Apparatus and method for generating an image | |
- WO2018117753A1 (fr) | Electronic device and control method thereof | |
- WO2018117538A1 (fr) | Lane information estimation method, and electronic device | |
- WO2020098013A1 (fr) | Television program recommendation method, terminal, system, and storage medium | |
- WO2020153785A1 (fr) | Electronic device and method for providing a graphic object corresponding to emotion information by using same | |
- EP3545685A1 (fr) | Method and apparatus for filtering video | |
- WO2019088338A1 (fr) | Electronic device and control method thereof | |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19849815; Country of ref document: EP; Kind code of ref document: A1
 | NENP | Non-entry into the national phase | Ref country code: DE
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 19849815; Country of ref document: EP; Kind code of ref document: A1