
CN111759203A - Intelligent face cleaning and skin care method, face cleaning and skin care instrument and computer readable storage medium - Google Patents

Intelligent face cleaning and skin care method, face cleaning and skin care instrument and computer readable storage medium

Info

Publication number: CN111759203A
Application number: CN202010552906.XA
Authority: CN (China)
Prior art keywords: skin, face, user, area, cleaning
Other languages: Chinese (zh)
Inventors: 赵斌 (Zhao Bin), 邓艳桃 (Deng Yantao)
Current and original assignee: Shenzhen Jingxiangxin Electronics Co., Ltd. (application filed by Shenzhen Jingxiangxin Electronics Co., Ltd.)
Priority application: CN202010552906.XA
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47K: SANITARY EQUIPMENT NOT OTHERWISE PROVIDED FOR; TOILET ACCESSORIES
    • A47K7/00: Body washing or cleaning implements
    • A47K7/04: Mechanical washing or cleaning devices, hand or mechanically, i.e. power operated

Landscapes

  • Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention discloses an intelligent face cleaning and skin care method comprising the following steps: connecting to a preset intelligent terminal, acquiring at least one face image of a user, and inputting the at least one face image into a pre-trained neural network to obtain a skin analysis result for the user; acquiring at least one skin feature of the user's face according to the skin analysis result, grouping the areas of the user's face that share the same skin feature into one skin area, and thereby obtaining at least one skin area; evaluating the severity of the skin feature corresponding to each skin area, and setting a face cleaning and/or skin care mode for each skin area according to the evaluation result; and acquiring the current contact position, determining the skin area corresponding to the current contact position, and performing cleaning and/or skin care using the mode corresponding to that skin area. The invention also discloses a face cleaning and skin care instrument and a computer-readable storage medium. The invention allows the user's face to be cleaned and/or cared for more appropriately and improves the face cleaning and/or skin care effect.

Description

Intelligent face cleaning and skin care method, face cleaning and skin care instrument and computer readable storage medium
Technical Field
The invention relates to the technical field of skin care, and in particular to an intelligent face cleaning and skin care method, a face cleaning and skin care instrument, and a computer-readable storage medium.
Background
With the improvement of people's living standards, more and more people use intelligent face cleaning instruments for daily facial cleansing and/or skin care. The skin differs from one part of the face to another, so the time, intensity, temperature, and other parameters required for cleaning and/or care also differ. However, existing face cleaning and skin care instruments on the market offer only a single cleaning and/or care mode and cannot apply different cleaning and/or care methods to the varying skin conditions of an individual, which can lead to over-cleaning/over-care or inadequate cleaning/care.
Disclosure of Invention
Based on this, it is necessary to provide an intelligent face cleaning and skin care method, a face cleaning and skin care instrument, and a computer-readable storage medium that solve the above problems.
An intelligent face cleaning and skin care method comprises the following steps: connecting to a preset intelligent terminal, acquiring at least one face image of a user, and inputting the at least one face image into a pre-trained neural network to obtain a skin analysis result for the user; acquiring at least one skin feature of the user's face according to the skin analysis result, grouping the areas of the user's face that share the same skin feature into one skin area, and thereby obtaining at least one skin area of the user's face; evaluating the severity of the skin feature corresponding to each skin area, and setting a face cleaning and/or skin care mode for each skin area according to the evaluation result; and acquiring the current contact position, determining the skin area corresponding to the current contact position, and performing cleaning and/or skin care using the mode corresponding to that skin area.
A face cleaning and skin care instrument comprises: a skin analysis module configured to connect to a preset intelligent terminal, acquire at least one face image of a user, and obtain a skin analysis result of the user from the at least one face image; a region dividing module configured to acquire at least one skin feature of the user's face according to the skin analysis result, group the areas of the user's face that share the same skin feature into one skin area, and obtain at least one skin area of the user's face; a mode setting module configured to evaluate the severity of the skin feature corresponding to each skin area and set a face cleaning and skin care mode for each skin area according to the evaluation result; and a face cleaning and skin care module configured to acquire the current position, determine the skin area corresponding to the current position, and perform face cleaning and skin care using the mode corresponding to that skin area.
A face cleaning and skin care instrument comprises a processor and a memory, the processor being coupled to the memory, the memory storing a computer program, and the processor executing the computer program to implement the method described above.
A computer-readable storage medium storing a computer program executable by a processor to implement a method as described above.
The embodiment of the invention has the following beneficial effects:
A skin analysis result of the user is obtained from at least one face image of the user, the user's face is divided into at least one skin area according to the skin analysis result, a face cleaning and/or skin care mode is set for each skin area, the skin area corresponding to the current contact position is determined, and cleaning and/or skin care is performed using the mode corresponding to that skin area. Different modes can therefore be applied to different facial areas according to each user's skin condition, so the user's face is cleaned and/or cared for more appropriately and the face cleaning and/or skin care effect is improved.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. It is obvious that the drawings described below show only some embodiments of the present invention, and that other drawings can be derived from them by those skilled in the art without creative effort.
Wherein:
FIG. 1 is a schematic flow chart of a first embodiment of the intelligent face cleaning and skin care method provided by the present invention;
FIG. 2 is a schematic flow chart of a second embodiment of the intelligent facial cleansing and skin care method provided by the present invention;
FIG. 3 is a schematic flow chart of a third embodiment of the intelligent facial cleansing and skin care method provided by the present invention;
FIG. 4 is a schematic flow chart of a fourth embodiment of the intelligent facial cleansing and skin care method provided by the present invention;
FIG. 5 is a schematic flow chart of a fifth embodiment of the intelligent facial cleansing and skin care method provided by the present invention;
FIG. 6 is a schematic structural view of a first embodiment of the face cleaning and skin care instrument provided by the present invention;
FIG. 7 is a schematic structural view of a second embodiment of the face cleaning and skin care instrument provided by the present invention;
FIG. 8 is a schematic structural diagram of an embodiment of a computer-readable storage medium provided by the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. It is obvious that the described embodiments are only some, and not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a schematic flow chart of a first embodiment of an intelligent face-cleaning and skin-care method provided by the present invention. The intelligent face cleaning and skin care method provided by the invention comprises the following steps:
S101: Connect to a preset intelligent terminal, acquire at least one face image of the user, and input the at least one face image into a pre-trained neural network to obtain a skin analysis result for the user.
In a specific implementation scenario, the face cleaning and skin care instrument connects to a preset intelligent terminal before face cleaning and/or skin care and acquires at least one face image of the user. The preset intelligent terminal may be, for example, the user's mobile phone, tablet, or computer, and at least one face image of the user stored on the terminal is acquired. Because the user's face images need to be analysed, and the images on the terminal are not necessarily all face images (they may include full-body photographs, group photographs, and so on), at least one person image may first be obtained from the terminal and face recognition performed on it; images that do not contain the user are deleted to obtain the user images. The user images are then screened to remove those that are heavily beautified, too low in resolution, or otherwise of poor quality, yielding the images to be processed, which are cropped to obtain at least one face image of the user. Further, the at least one face image may be resized, for example by scaling, so that the faces in the images are magnified to the same scale and/or the images have the same size.
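As a rough illustration of the screening and cropping step described above (not part of the original disclosure), the sketch below filters a batch of photos down to cropped, uniformly sized face images. It assumes OpenCV's bundled Haar cascade for face detection; the resolution threshold and output size are placeholder values, and the identity-matching and beautification-screening steps are omitted.

```python
import cv2

MIN_SIDE = 200          # assumed minimum resolution for a usable photo
OUT_SIZE = (256, 256)   # assumed common size for the cropped face images

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def extract_face_images(photo_paths):
    """Keep photos that contain exactly one detectable face, crop that
    face, and rescale every crop to the same size."""
    faces = []
    for path in photo_paths:
        img = cv2.imread(path)
        if img is None or min(img.shape[:2]) < MIN_SIDE:
            continue                      # unreadable or too low resolution
        boxes = detector.detectMultiScale(
            cv2.cvtColor(img, cv2.COLOR_BGR2GRAY), 1.1, 5)
        if len(boxes) != 1:
            continue                      # skip group photos or no face found
        x, y, w, h = boxes[0]
        faces.append(cv2.resize(img[y:y + h, x:x + w], OUT_SIZE))
    return faces
```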
The at least one face image is input into a pre-trained neural network to obtain a skin analysis result for the user. Further, skin varies considerably with factors such as ethnicity, age, and gender; for example, men's pores are generally coarser than women's, and elderly people generally have more wrinkles than young people. Samples can therefore be collected in a targeted manner, such as a sample of middle-aged Asian women or a sample of elderly Northern European men, and neural networks trained on these samples to obtain pre-trained networks for different groups, for example a network for middle-aged Asian women. Basic information about the user, such as ethnicity, age, and gender, is obtained, and the at least one face image of the user is input into the corresponding pre-trained neural network to obtain the skin analysis result of the user.
In the present implementation scenario, the skin analysis result includes analysis results for the various regions of the user's face. For example, the user's forehead may be oily, the cheek areas dry, the pores of the nose enlarged, the chin acne-prone, and the cheekbones blotchy. Alternatively, the area coordinates corresponding to the different skin types may be given; for example, the area coordinate of oily skin is A, of dry skin B, of coarse pores C, of acne D, and of the spot area E. It should be noted that the area coordinates may overlap with one another; for example, area coordinate E partially overlaps area coordinate B, and area coordinates C and D at least partially overlap area coordinate A.
In other implementations, the at least one face image may also be image-processed directly. For example, the at least one face image may be binarized to obtain at least one binary image, the discrete points in each binary image may be taken as pore features to obtain a pore distribution image, and the number of pores and the size of each pore in the pore distribution image may be calculated to obtain a coarse-pore region and its corresponding area coordinates.
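One possible reading of that binarization-based pore analysis is sketched below; this is an illustrative interpretation rather than the patent's actual algorithm, and the threshold parameters and the pore-size cut-off are assumed values.

```python
import cv2

def pore_statistics(face_bgr, block_size=51, offset=5, max_pore_px=60):
    """Binarize a face image, treat small dark connected components as
    pores, and return their count and mean area in pixels."""
    gray = cv2.cvtColor(face_bgr, cv2.COLOR_BGR2GRAY)
    # adaptive threshold: pores appear as small dark blobs -> foreground
    binary = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                   cv2.THRESH_BINARY_INV, block_size, offset)
    n, _, stats, _ = cv2.connectedComponentsWithStats(binary)
    areas = stats[1:, cv2.CC_STAT_AREA]        # skip the background label
    pores = areas[areas <= max_pore_px]        # discard large blobs (hair, shadow)
    if len(pores) == 0:
        return 0, 0.0
    return int(len(pores)), float(pores.mean())
```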
S102: and acquiring at least one skin feature of the face of the user according to the skin analysis result, dividing an area with the same skin feature in the face area of the user into one skin area, and acquiring at least one skin area of the face of the user.
In the present implementation scenario, the user's face is divided into at least one skin area according to the skin analysis result, for example according to the area coordinates corresponding to the different skin types in the skin analysis result. For example, if the user's face has five skin features, the area having the first skin feature is taken as a first skin area with area coordinates A, the area having the second skin feature as a second skin area with area coordinates B, the area having the third skin feature as a third skin area with area coordinates C, the area having the fourth skin feature as a fourth skin area with area coordinates D, and the area having the fifth skin feature as a fifth skin area with area coordinates E.
Alternatively, the forehead may be taken as the first skin area, the two cheeks as the second skin area, the nose as the third skin area, the chin as the fourth skin area, and the cheekbones as the fifth skin area.
Specifically, the user's face may be divided into a plurality of unit areas, the skin feature corresponding to each unit area determined, and unit areas having the same skin feature grouped into the same skin area. Furthermore, a unit area may correspond to several skin features, for example both oily skin and coarse pores. The picture of that unit area may then be analysed further to determine which of its skin features is the more serious problem, for example by measuring the oil output of the unit area, or by estimating the coarseness of the pores from the light-reflection characteristics of the picture, for example measuring the pore diameter and/or counting the pores. If the analysis shows that the oily-skin problem of the unit area is the more serious, the unit area is treated as part of the oily skin area. Alternatively, the unit area may be assigned to both the oily skin area and the coarse-pore skin area.
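One way to express this unit-area division, shown as a sketch rather than the patent's implementation, is to tag each grid cell with its dominant (most severe) skin feature and merge cells that share a feature into one skin region; the cell coordinates and severity scores below are invented for illustration.

```python
from collections import defaultdict

def divide_into_skin_regions(cell_features):
    """cell_features maps a grid cell (row, col) to a dict of
    {skin_feature: severity_score}.  Each cell is assigned to its most
    severe feature, and cells with the same feature form one region."""
    regions = defaultdict(list)
    for cell, scores in cell_features.items():
        dominant = max(scores, key=scores.get)   # keep the worst problem
        regions[dominant].append(cell)
    return dict(regions)

# Hypothetical example: two cells are both oily and large-pored.
cells = {
    (0, 0): {"oily": 0.8, "large_pores": 0.3},
    (0, 1): {"oily": 0.4, "large_pores": 0.7},
    (1, 0): {"dry": 0.6},
}
print(divide_into_skin_regions(cells))
# {'oily': [(0, 0)], 'large_pores': [(0, 1)], 'dry': [(1, 0)]}
```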
S103: and evaluating the severity of the skin characteristics corresponding to each skin area, and setting a facial cleaning and/or skin care mode corresponding to each skin area according to the evaluation result.
In the present implementation scenario, a corresponding face cleaning and/or skin care mode is set for each skin area. For example, the first skin area is oily skin: the cleansing time required is longer, and the instrument tip in contact with the face vibrates at high frequency and small amplitude to generate more cleansing foam and achieve a better cleansing effect; during skin care, the first skin area may be irradiated with blue light (415 nm ± 10 nm), which is suited to oily skin. As another example, the second skin area is dry skin: the cleansing time required is shorter, and the instrument tip vibrates at low frequency and low amplitude to avoid over-cleaning. The fifth skin area is a spot area, and a radio-frequency method may be used during skin care to remove the spots in it. The third skin area has enlarged pores, and the temperature of the instrument tip in contact with the face may be lowered during cleaning and/or skin care so that the pores shrink.
Further, the severity of the skin feature corresponding to each skin area is evaluated. For example, the partial image corresponding to each skin area may be input into a pre-trained neural network to obtain a severity grade: the forehead is oily skin with an oiliness of grade two, the cheeks are dry skin with a dryness of grade one, the nose has coarse pores with a coarseness of grade three, the chin is acne skin with an acne severity of grade one, and the cheekbones are spotted skin with a spot severity of grade two. The higher the grade, the worse the skin condition.
A different face cleaning and/or skin care mode is then set according to the severity corresponding to each skin feature. For example, the face cleaning and/or skin care modes selected by other users in a database who have the same or similar skin features can be obtained, and the mode for the skin area can be set according to the user's own usage habits. Suppose the other users selected either a cleansing duration of 20 s at moderate intensity or a cleansing duration of 15 s at strong intensity. If the user habitually prefers strong cleansing, the mode is set to a duration of 15 s at strong intensity to match that habit; if the user habitually prefers mild cleansing, the mode is set to a duration of 20 s at moderate intensity to come closer to that habit.
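A minimal sketch of that mode-selection logic (choosing among modes used by similar users, biased toward the user's own habit) might look as follows; the data structures and the nearest-intensity rule are assumptions for illustration only.

```python
def choose_mode(candidate_modes, preferred_intensity):
    """candidate_modes: list of dicts like
    {"duration_s": 20, "intensity": "moderate"} collected from users with
    the same or similar skin features.  Pick the candidate whose intensity
    matches the user's habitual preference; fall back to the closest one
    when there is no exact match."""
    order = ["mild", "moderate", "strong"]
    target = order.index(preferred_intensity)
    return min(candidate_modes,
               key=lambda m: abs(order.index(m["intensity"]) - target))

candidates = [{"duration_s": 20, "intensity": "moderate"},
              {"duration_s": 15, "intensity": "strong"}]
print(choose_mode(candidates, "strong"))  # {'duration_s': 15, 'intensity': 'strong'}
print(choose_mode(candidates, "mild"))    # {'duration_s': 20, 'intensity': 'moderate'}
```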
In other implementation scenarios, the skin-feature severities of other users and their corresponding face cleaning and/or skin care modes can be collected and used to train a neural network; the severity of each of the user's skin features is then input into the trained network, and the corresponding face cleaning and/or skin care mode is set according to its output.
S104: acquiring a current contact position, determining a skin area corresponding to the current contact position, and cleaning and/or caring the face by using a cleaning and/or caring mode corresponding to the skin area.
In the present embodiment, during cleaning and/or skin care the current contact position is acquired by an infrared sensor; for example, the distance and orientation relative to the nose tip can be measured, and the coordinates of the current contact position in the face area deduced from preset nose-tip coordinates. The skin area corresponding to the current contact position is determined from these coordinates, and cleaning and/or skin care is performed using the mode corresponding to that skin area. For example, if the current contact position is determined to belong to the first skin area, the mode corresponding to the first skin area is applied: a longer cleansing time, high-frequency small-amplitude vibration of the instrument tip in contact with the face, and blue-light irradiation (415 nm ± 10 nm) during skin care. If the current contact position is determined to belong to the third skin area, the mode corresponding to the third skin area is applied: a longer cleansing time, high-frequency large-amplitude vibration of the instrument tip, a lowered tip temperature, and a lowered tip temperature for a cold compress during skin care.
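The position-to-region lookup described above could be sketched as below, assuming the infrared sensor yields an offset from the nose tip and each skin region is stored as a bounding box in the same face coordinate system; the coordinates and region names are invented for illustration, not taken from the patent.

```python
NOSE_TIP = (128, 150)   # assumed preset nose-tip coordinate in the face image

# assumed region boxes: (x_min, y_min, x_max, y_max) in face coordinates
SKIN_REGIONS = {
    "forehead_oily": (60, 20, 200, 80),
    "nose_large_pores": (110, 110, 150, 180),
    "chin_acne": (95, 200, 165, 250),
}

def region_at_contact(dx, dy):
    """dx, dy: offset of the current contact point from the nose tip,
    as reported by the infrared sensor."""
    x, y = NOSE_TIP[0] + dx, NOSE_TIP[1] + dy
    for name, (x0, y0, x1, y1) in SKIN_REGIONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

print(region_at_contact(0, -100))   # -> 'forehead_oily'
```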
As can be seen from the above description, in this embodiment a skin analysis result of the user is obtained from at least one face image of the user, the user's face is divided into at least one skin area according to the skin analysis result, a face cleaning and/or skin care mode is set for each skin area, the skin area corresponding to the current contact position is determined, and cleaning and/or skin care is performed using the mode corresponding to that skin area. Different modes can therefore be applied to different facial areas according to each user's skin condition, so the user's face is cleaned and/or cared for more appropriately and the face cleaning and/or skin care effect is improved.
Referring to fig. 2, fig. 2 is a schematic flow chart of a second embodiment of the intelligent face cleansing and skin care method provided by the present invention. The intelligent face cleaning and skin care method provided by the invention comprises the following steps:
S201: Connect to a preset intelligent terminal, acquire at least one face image of the user, and input the at least one face image into a pre-trained neural network to obtain a skin analysis result for the user.
In a specific implementation scenario, step S201 is substantially the same as step S101 in the first embodiment of the intelligent face cleaning and skin care method provided by the present invention, and details thereof are not repeated here.
S202: acquiring the shooting time and the skin analysis result of at least one face image, and judging whether the skin of the user is improved or not according to the shooting time and the skin analysis result. If so, the process is terminated, otherwise, step S203 is executed.
In this embodiment, the shooting time of the at least one face image is acquired; the shooting time of a photograph is stored in its Exif (Exchangeable image file format) information when the photograph is taken, so the shooting time of the to-be-processed image corresponding to each face image can be taken as the shooting time of that face image. Each face image is input into the pre-trained neural network to obtain a skin analysis result for that image; specifically, each region of the user's face is graded, for example the forehead is oily skin with an oiliness of grade two, the cheeks are dry skin with a dryness of grade one, the nose has coarse pores with a coarseness of grade three, the chin is acne skin with an acne severity of grade one, and the cheekbones are spotted skin with a spot severity of grade two. The higher the grade, the worse the skin condition.
Whether the user's skin has improved is judged by combining the shooting time of each face image with its skin analysis result. For example, the first face image was captured on 1 January 2020, and its skin analysis result showed that the forehead was oily skin of grade two, the cheeks dry skin of grade one, the nose coarse-pored skin of grade three, the chin acne skin of grade two, and the cheekbones spotted skin of grade two. The second face image was captured on 1 February 2020, and its skin analysis result showed that the forehead was oily skin of grade one, the cheeks dry skin of grade two, the nose coarse-pored skin of grade three, the chin acne skin of grade one, and the cheekbones spotted skin of grade two.
According to the comparison, the skin quality was improved in the forehead area and chin area, and the skin quality was not improved in the remaining areas.
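The grade comparison behind that conclusion can be expressed compactly as in the sketch below; the region names and grades mirror the example above, and a lower grade is taken to mean better skin, as stated in the description.

```python
def improved_regions(earlier, later):
    """Both arguments map region name -> severity grade (1 = mild).
    A region counts as improved if its grade went down."""
    return {r for r in earlier if later.get(r, earlier[r]) < earlier[r]}

jan = {"forehead": 2, "cheeks": 1, "nose": 3, "chin": 2, "cheekbones": 2}
feb = {"forehead": 1, "cheeks": 2, "nose": 3, "chin": 1, "cheekbones": 2}
print(improved_regions(jan, feb))   # {'forehead', 'chin'}
```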
S203: and acquiring the skin area without improvement, and adjusting the face cleaning and/or skin care mode corresponding to the skin area without improvement according to the skin analysis result.
In the present embodiment, the skin areas in which no improvement occurred, such as the cheek areas, the nose area, and the cheekbone area, are obtained. If it can be inferred from the skin analysis result that the reason for the lack of improvement is, for example, that the cleansing intensity is too strong, the face cleaning and/or skin care modes corresponding to these areas are adjusted in a targeted manner: the cleansing time in the corresponding cleansing mode can be shortened, the vibration frequency and amplitude of the instrument tip in contact with the face during cleansing can be reduced, the temperature of the tip during skin care can be lowered, the intensity of the radio-frequency irradiation can be adjusted, and so on.
As can be seen from the above description, in this embodiment whether the user's skin has improved is judged from the shooting times and skin analysis results of at least one face image, and if it has not, the face cleaning and/or skin care modes corresponding to the unimproved skin areas are adjusted according to the skin analysis result. This further improves the cleaning and/or skin care effect and increases user satisfaction.
Referring to fig. 3, fig. 3 is a schematic flow chart of a third embodiment of the intelligent face cleansing and skin care method provided by the present invention. The intelligent face cleaning and skin care method provided by the invention comprises the following steps:
S301: Connect to a preset intelligent terminal, acquire at least one face image of the user, and input the at least one face image into a pre-trained neural network to obtain a skin analysis result for the user.
S302: Acquire at least one skin feature of the user's face according to the skin analysis result, group the areas of the user's face that share the same skin feature into one skin area, and obtain at least one skin area of the user's face.
S303: Evaluate the severity of the skin feature corresponding to each skin area, and set a face cleaning and/or skin care mode for each skin area according to the evaluation result.
In a specific implementation scenario, steps S301 to S303 are substantially the same as steps S101 to S103 in the first embodiment of the intelligent face-cleaning and skin-care method provided by the present invention, and are not described herein again.
S304: acquiring personal information of a user, wherein the personal information comprises at least one of sex, age, height, weight, work and rest habits, eating habits, physical information and physical examination information.
In the implementation scenario, the user's personal information can be acquired from the preset intelligent terminal, because personal information also affects the user's skin. For example, people whose diet favours strong flavours are prone to wrinkles, spots, acne, and other skin problems and need preventive treatment during cleaning and/or skin care; the user's eating habits can be obtained from a food-delivery app on the preset intelligent terminal that the user commonly uses. People with irregular work and rest habits easily develop oily skin and acne and likewise need preventive treatment; work and rest habits can be obtained from a health app on the preset intelligent terminal. The user's constitution information and physical examination information can be entered by the user or extracted from the user's physical examination reports and medical records. For example, people with high blood sugar and people with a damp-heat constitution are prone to acne, which can be prevented in advance. In addition, the user's gender, age, height, and weight also influence skin quality, and big-data analysis can reveal the skin problems that commonly affect people of similar gender, age, height, and weight, so that they can be prevented in advance during cleaning and/or skin care.
S305: and acquiring the environmental information of the current day, wherein the environmental information comprises at least one of weather, air temperature, air humidity, ultraviolet intensity and PM2.5 index.
In the present implementation scenario, cleaning and/or skin care is part of daily life, and the day's environment also affects the user's skin. For example, when the temperature is high the skin is more prone to oil, and the cleansing intensity needs to be increased, for example by prolonging the cleansing time. When the air humidity is low the skin dries more easily, and the cleansing intensity needs to be reduced, for example by shortening the cleansing time. When the ultraviolet intensity is high the skin is easily sunburned, and the cleansing intensity needs to be reduced, for example by shortening the cleansing time and lowering the temperature of the instrument tip in contact with the face. When the PM2.5 index is high, acne is more likely to appear and increased cleansing is desirable, for example a longer cleansing time. The day's environmental information can be obtained through the preset intelligent terminal, for example from a weather app.
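As a rough illustration of how the day's environment might nudge the cleansing parameters in the directions just described, consider the sketch below; the thresholds and step sizes are invented, not values specified by the patent.

```python
def adjust_for_environment(mode, temp_c, humidity_pct, uv_index, pm25):
    """mode: {'duration_s': int, 'tip_temp_c': float}.  Returns an
    adjusted copy following the qualitative rules in the description."""
    m = dict(mode)
    if temp_c >= 30:          # hot day: skin oilier -> clean longer
        m["duration_s"] += 5
    if humidity_pct <= 30:    # dry air -> clean less
        m["duration_s"] = max(5, m["duration_s"] - 5)
    if uv_index >= 8:         # likely sunburn -> gentler, cooler tip
        m["duration_s"] = max(5, m["duration_s"] - 5)
        m["tip_temp_c"] -= 3
    if pm25 >= 150:           # heavy pollution -> clean longer
        m["duration_s"] += 5
    return m

print(adjust_for_environment({"duration_s": 20, "tip_temp_c": 32.0},
                             temp_c=33, humidity_pct=25, uv_index=9, pm25=60))
```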
S306: and adjusting the face cleaning and/or skin care mode corresponding to the at least one skin area according to the environment information and/or the personal information.
In this implementation scenario, the facial cleansing and/or skin care mode of at least one skin area is adjusted based on the environmental and/or personal information obtained in the previous step. The face cleaning and/or skin care mode of all the skin regions may be adjusted, or only the face cleaning and/or skin care mode of a partial region may be adjusted.
S307: acquiring a current contact position, determining a skin area corresponding to the current contact position, and cleaning and/or caring the skin by using a cleaning and caring mode corresponding to the skin area.
In this implementation scenario, step S307 is substantially the same as step S104 in the first embodiment of the intelligent face cleaning and skin care method provided by the present invention, and details thereof are not repeated here.
S308: recording the face cleaning and/or skin care track of the user, and judging whether the face cleaning and/or skin care track covers the whole skin area. If not, go to step S308. If yes, the process is ended.
In the implementation scenario, each time the current contact position is acquired it is recorded, and the user's face cleaning and/or skin care trajectory is generated from the recorded positions. Whether the trajectory covers the entire skin area is then judged. Further, it may be detected whether the instrument tip in contact with the face has been away from the face for longer than a preset length of time, and if so, whether the trajectory covers the entire skin area is judged.
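One simple way to realise such a coverage check, shown purely as a sketch, is to record which grid cells of a skin region the contact point has passed through; the grid resolution and the coverage threshold below are assumptions.

```python
class CoverageTracker:
    """Tracks which cells of a skin region the instrument tip has touched."""

    def __init__(self, region_box, cell_px=10):
        self.x0, self.y0, self.x1, self.y1 = region_box
        self.cell = cell_px
        self.visited = set()
        cols = (self.x1 - self.x0) // cell_px + 1
        rows = (self.y1 - self.y0) // cell_px + 1
        self.total = cols * rows

    def record(self, x, y):
        """Call with every acquired contact coordinate."""
        if self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1:
            self.visited.add(((x - self.x0) // self.cell,
                              (y - self.y0) // self.cell))

    def fully_covered(self, threshold=0.95):
        return len(self.visited) / self.total >= threshold
```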
S309: the user is prompted for areas of skin that are not yet covered.
In the implementation scenario, the user may be prompted about the skin areas that are not yet covered by means of vibration, a flashing indicator light, voice, or the like.
As can be seen from the above description, in this embodiment the face cleaning and/or skin care mode is adjusted according to the environmental information and/or the user's personal information, and when the cleaning and/or skin care trajectory does not cover the entire skin area, the user is prompted about the uncovered skin areas, which effectively improves the cleaning and/or skin care result.
Referring to fig. 4, fig. 4 is a schematic flow chart of a fourth embodiment of the intelligent face cleansing and skin care method provided by the present invention. The intelligent face cleaning and skin care method provided by the invention comprises the following steps:
S401: Connect to a preset intelligent terminal, acquire at least one face image of the user, and input the at least one face image into a pre-trained neural network to obtain a skin analysis result for the user.
S402: Acquire at least one skin feature of the user's face according to the skin analysis result, group the areas of the user's face that share the same skin feature into one skin area, and obtain at least one skin area of the user's face.
S403: Evaluate the severity of the skin feature corresponding to each skin area, and set a face cleaning and/or skin care mode for each skin area according to the evaluation result.
In a specific implementation scenario, steps S401 to S403 are substantially the same as steps S101 to S103 in the first embodiment of the intelligent face-cleaning and skin-care method provided by the present invention, and are not described herein again.
S404: the method comprises the steps of obtaining a current contact position, obtaining a face contour of a user according to at least one face image, determining a contour angle corresponding to the current contact position according to the face contour, and adjusting an operation angle of an instrument end part according to the contour angle to enable the instrument end part to be attached to the face contour of the user.
In the present implementation scenario, the current contact position is acquired by the infrared sensor, for example by measuring the distance and orientation relative to the nose tip and deducing the coordinates of the current contact position in the face area from preset nose-tip coordinates. The at least one face image is analysed with a preset algorithm to obtain the user's facial contour; for example, the at least one face image may be input into a pre-trained neural network to obtain the facial contour.
The contour angle of the facial contour at the current contact position is determined, and the operating angle of the instrument tip in contact with the face is adjusted according to that contour angle. For example, the initial angle of the tip can be read from a gyroscope in the face cleaning and skin care instrument, the angle that needs to be adjusted calculated from the contour angle and the initial angle, and the operating angle of the tip adjusted accordingly so that the tip fits the user's facial contour.
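Numerically, that correction reduces to subtracting the gyroscope's current reading from the contour angle at the contact point; a minimal sketch with angles in degrees (hypothetical values, not from the patent) is given below.

```python
def angle_correction(contour_angle_deg, gyro_angle_deg):
    """Return the rotation the tip must apply so that it lies flat against
    the local facial contour, normalised to the range (-180, 180]."""
    delta = contour_angle_deg - gyro_angle_deg
    return (delta + 180.0) % 360.0 - 180.0

# Hypothetical usage: the contour at the contact point slopes at 25 degrees
# and the gyroscope reports the tip currently sitting at -10 degrees.
print(angle_correction(25.0, -10.0))   # 35.0 -> rotate the tip by +35 degrees
```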
S405: and determining a skin area corresponding to the current contact position, and cleaning and/or caring the skin by using a cleaning and caring mode corresponding to the skin area.
In this implementation scenario, step S405 is substantially the same as step S104 in the first embodiment of the intelligent face cleaning and skin care method provided by the present invention, and is not repeated here.
As can be seen from the above description, in this embodiment the user's facial contour is obtained from at least one face image, the contour angle corresponding to the current contact position is determined from the facial contour, and the operating angle of the instrument tip is adjusted according to the contour angle so that the tip fits the user's facial contour, which effectively improves the cleaning and/or skin care result.
Referring to fig. 5, fig. 5 is a schematic flow chart of a fifth embodiment of the intelligent face-cleaning and skin-care method provided by the present invention. The intelligent face cleaning and skin care method provided by the invention comprises the following steps:
S501: Connect to a preset intelligent terminal, acquire at least one face image of the user, and input the at least one face image into a pre-trained neural network to obtain a skin analysis result for the user.
S502: Acquire at least one skin feature of the user's face according to the skin analysis result, group the areas of the user's face that share the same skin feature into one skin area, and obtain at least one skin area of the user's face.
S503: Evaluate the severity of the skin feature corresponding to each skin area, and set a face cleaning and/or skin care mode for each skin area according to the evaluation result.
In a specific implementation scenario, steps S501 to S503 are substantially the same as steps S101 to S103 in the first embodiment of the intelligent facial cleansing and skin care method provided by the present invention, and are not described herein again.
S504: and recording the use data of the user and uploading the use data to the cloud server.
In this implementation scenario, the user's usage data are recorded; the usage data include the user's at least one skin area and the face cleaning and/or skin care mode corresponding to each skin area. The usage data are uploaded to the cloud server, so that when the user changes to a new face cleaning and skin care instrument the data can be downloaded from the cloud and used to clean and/or care for the user's face, avoiding the need to set the parameters again on the new instrument. Further, the user's usage data may serve as a reference when setting the face cleaning and/or skin care modes of other users.
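As a sketch of such cloud synchronisation, the usage data could be serialised to JSON and exchanged with a server over HTTP; the endpoint, payload layout, and use of the `requests` library below are illustrative assumptions, not an API defined by the patent.

```python
import requests  # assumed available

CLOUD_URL = "https://example.com/api/usage"   # placeholder endpoint

def upload_usage(user_id, skin_regions):
    """skin_regions: {region_name: {'duration_s': ..., 'intensity': ...}}."""
    payload = {"user_id": user_id, "skin_regions": skin_regions}
    resp = requests.post(CLOUD_URL, json=payload, timeout=10)
    resp.raise_for_status()

def download_usage(user_id):
    """Restore the stored modes on a newly purchased instrument."""
    resp = requests.get(CLOUD_URL, params={"user_id": user_id}, timeout=10)
    resp.raise_for_status()
    return resp.json()["skin_regions"]
```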
S505: and recording a skin analysis result, comparing the skin analysis result with the previously recorded skin analysis result, and judging whether the skin of the user is improved. If so, go to step S507, otherwise go to step S506.
In the implementation scenario, the skin analysis result is recorded at a preset interval, or the skin analysis result obtained on each use is recorded. The skin analysis result in the previously uploaded usage data is obtained, the current skin analysis result is compared with it, and whether the user's skin has improved is judged; the specific judging process is substantially the same as that of step S202 in the second embodiment of the intelligent face cleaning and skin care method provided by the present invention and is not repeated here.
S506: and acquiring adjustment data of the face cleaning and/or skin care modes of other users, which are the same as or similar to the use data of the users, and adjusting the face cleaning and/or skin care modes corresponding to the skin type areas which are not improved according to the adjustment data.
In this implementation scenario, the adjustment data of the face cleaning and/or skin care mode of another user whose usage data are the same as or similar to the user's are obtained. For example, if another user A has the same or similar usage data, the adjustment that user A made to the face cleaning and/or skin care mode can be used as a reference to adjust the modes corresponding to the user's unimproved skin areas accordingly.
Specifically, the adjustment data of several other users whose usage data are the same as or similar to the user's can be acquired, and the face cleaning and/or skin care modes corresponding to the unimproved skin areas adjusted according to a preset algorithm and the user's own habits. For example, the adjustment adopted by the largest number of users, the adjustment closest to the user's habits, or a combination of the two may be selected and applied to the face cleaning and/or skin care modes of the unimproved skin areas.
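The two selection strategies just mentioned (the most frequently used adjustment versus the one closest to the user's own habit) could be sketched as follows; the data shapes are assumptions made for illustration.

```python
from collections import Counter

def most_common_adjustment(adjustments):
    """adjustments: list of hashable tuples like ('duration_s', -5)."""
    return Counter(adjustments).most_common(1)[0][0]

def closest_to_habit(adjustments, preferred):
    """Pick the adjustment whose numeric value is closest to the user's
    habitual setting (e.g. preferred change in cleansing duration)."""
    return min(adjustments, key=lambda a: abs(a[1] - preferred))

adj = [("duration_s", -5), ("duration_s", -5), ("duration_s", -10)]
print(most_common_adjustment(adj))   # ('duration_s', -5)
print(closest_to_habit(adj, -8))     # ('duration_s', -10)
```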
Furthermore, the usage data include the user's unimproved skin areas; the adjustment data of other users who have the same unimproved areas can be obtained, and the adjustments those users made to their face cleaning and/or skin care modes used as a reference to adjust the modes corresponding to the user's unimproved skin areas accordingly.
S507: acquiring a current contact position, determining a skin area corresponding to the current contact position, and cleaning and/or caring the skin by using a cleaning and caring mode corresponding to the skin area.
In this implementation scenario, step S507 is substantially the same as step S104 in the first embodiment of the intelligent face cleaning and skin care method provided by the present invention, and details thereof are not repeated here.
As described above, in this embodiment the user's usage data are recorded and uploaded to the cloud server, so that when the user changes to a new face cleaning and skin care instrument the data can be downloaded from the cloud and used to clean and/or care for the user's face. This avoids having to set the parameters again on the new instrument and reduces the complexity of use.
Referring to FIG. 6, FIG. 6 is a schematic structural view of a first embodiment of a face cleaning and skin care instrument according to the present invention. The face cleaning and skin care instrument 10 provided by the invention comprises a skin analysis module 11, a region dividing module 12, a mode setting module 13, and a face cleaning and skin care module 14. The skin analysis module 11 is configured to connect to a preset intelligent terminal, acquire at least one face image of the user, and obtain a skin analysis result of the user from the at least one face image. The region dividing module 12 is configured to acquire at least one skin feature of the user's face according to the skin analysis result, group the areas of the user's face that share the same skin feature into one skin area, and obtain at least one skin area of the user's face. The mode setting module 13 is configured to evaluate the severity of the skin feature corresponding to each skin area and set a face cleaning and skin care mode for each skin area according to the evaluation result. The face cleaning and skin care module 14 is configured to acquire the current position, determine the skin area corresponding to the current position, and perform face cleaning and skin care using the mode corresponding to that skin area.
The skin analysis module 11 is further configured to acquire the shooting time and skin analysis result of the at least one face image and to determine from them whether the user's skin has improved; if it has not, to acquire the unimproved skin areas and adjust the face cleaning and/or skin care modes corresponding to those areas according to the skin analysis result.
The mode setting module 13 is further configured to acquire personal information of the user, the personal information including at least one of gender, age, height, weight, work and rest habits, and eating habits; to acquire the environmental information of the current day, the environmental information including at least one of weather, air temperature, air humidity, ultraviolet intensity, and PM2.5 index; and to adjust the face cleaning and/or skin care mode corresponding to the at least one skin area according to the personal information and/or the environmental information.
The face cleaning and skin care module 14 is further configured to record the user's face cleaning and/or skin care trajectory and to determine whether the trajectory covers the entire skin area; if it does not, the user is prompted about the uncovered skin areas.
The face cleaning and skin care module 14 is further configured to obtain the user's facial contour from the at least one face image, determine the contour angle corresponding to the current contact position from the facial contour, and adjust the operating angle of the instrument tip according to the contour angle so that the tip fits the user's facial contour.
The face cleaning and skin care module 14 is further configured to record the user's usage data and upload them to the cloud server, the usage data including the at least one skin area and the face cleaning and/or skin care mode corresponding to each skin area.
As can be seen from the above description, the face cleaning and skin care instrument in this embodiment obtains a skin analysis result of the user from at least one face image of the user, divides the user's face into at least one skin area according to the skin analysis result, sets a face cleaning and/or skin care mode for each skin area, determines the skin area corresponding to the current contact position, and performs face cleaning and/or skin care using the mode corresponding to that skin area. Different modes can therefore be applied to different facial areas according to each user's skin condition, so the user's face is cleaned and/or cared for more appropriately and the face cleaning and/or skin care effect is improved.
Referring to FIG. 7, FIG. 7 is a schematic structural view of a second embodiment of a face cleaning and skin care instrument according to the present invention. The face cleaning and skin care instrument 20 includes a processor 21 and a memory 22. The processor 21 is coupled to the memory 22. The memory 22 stores a computer program which is executed by the processor 21 in operation to implement the methods shown in FIGS. 1 to 5. The detailed methods are described above and are not repeated here.
As can be seen from the above description, the face cleaning and skin care instrument in this embodiment obtains a skin analysis result of the user from at least one face image of the user, divides the user's face into at least one skin area according to the skin analysis result, sets a face cleaning and/or skin care mode for each skin area, determines the skin area corresponding to the current contact position, and performs face cleaning and/or skin care using the mode corresponding to that skin area. Different modes can therefore be applied to different facial areas according to each user's skin condition, so the user's face is cleaned and/or cared for more appropriately and the face cleaning and/or skin care effect is improved.
Referring to FIG. 8, FIG. 8 is a schematic structural diagram of an embodiment of a computer-readable storage medium according to the present invention. The computer-readable storage medium 30 stores at least one computer program 31, which is executed by a processor to implement the methods shown in FIGS. 1 to 5; the detailed methods are described above and are not repeated here. In one embodiment, the computer-readable storage medium 30 may be a memory chip in a terminal, a hard disk, or another readable and writable storage device such as a removable hard disk, a USB flash drive, or an optical disc, and may also be a server or the like.
As can be seen from the above description, the computer program in the storage medium of this embodiment may be used to obtain a skin analysis result of the user from at least one face image of the user, divide the user's face into at least one skin area according to the skin analysis result, set a face cleaning and/or skin care mode for each skin area, determine the skin area corresponding to the current contact position, and perform cleaning and/or skin care using the mode corresponding to that skin area.
Different from the prior art, the present invention applies different face cleaning and/or skin care modes to different facial areas according to each user's skin condition, so the user's face can be cleaned and/or cared for more appropriately and the face cleaning and/or skin care effect is improved.
The above disclosure describes only preferred embodiments of the present invention and is not intended thereby to limit the scope of the claims of the present invention; equivalent changes made in accordance with the claims of the present invention remain within the scope of the invention.

Claims (10)

1. An intelligent face cleaning and skin care method is characterized by comprising the following steps:
the method comprises the steps of connecting with a preset intelligent terminal, obtaining at least one face image of a user, inputting the at least one face image into a pre-trained neural network, and obtaining a skin analysis result of the user;
acquiring at least one skin feature of the face of the user according to the skin analysis result, dividing an area with the same skin feature in the face area of the user into one skin area, and acquiring at least one skin area of the face of the user;
evaluating the severity of the skin characteristics corresponding to each skin area, and setting a facial cleaning and/or skin care mode corresponding to each skin area according to the evaluation result;
acquiring a current contact position, determining the skin area corresponding to the current contact position, and cleaning and/or caring the skin by using a cleaning and caring mode corresponding to the skin area.
2. The intelligent face cleaning and skin care method according to claim 1, wherein the step of evaluating the severity of the skin feature corresponding to each skin area and setting the face cleaning and/or skin care mode corresponding to each skin area according to the evaluation result comprises:
recording the use data of a user, and uploading the use data to a cloud server, wherein the use data comprises the at least one skin area and a face cleaning and/or skin care mode corresponding to each skin area.
3. The intelligent face cleaning and skin care method according to claim 2, wherein the step of acquiring at least one face image of the user and acquiring the skin analysis result of the user based on the at least one face image is followed by:
acquiring the shooting time and the skin analysis result of the at least one face image, and judging whether the skin of the user is improved or not according to the shooting time and the skin analysis result; if the skin of the user is not improved, acquiring a skin area which is not improved, and adjusting a face cleaning and/or skin care mode corresponding to the skin area which is not improved according to the skin analysis result; and/or
Recording the skin analysis result, comparing the skin analysis result with the previously recorded skin analysis result, and judging whether the skin of the user is improved; and if the skin of the user is not improved, acquiring a skin area which is not improved, and adjusting a face cleaning and/or skin care mode corresponding to the skin area which is not improved according to the skin analysis result.
4. The intelligent face cleaning and skin care method according to claim 3, wherein the step of adjusting the face cleaning and/or skin care mode corresponding to the skin area in which no improvement occurs according to the skin analysis result comprises:
and acquiring adjustment data of the face cleaning and/or skin care modes of other users which are the same as or similar to the use data of the users, and adjusting the face cleaning and/or skin care modes corresponding to the skin type areas which are not improved according to the adjustment data.
5. The intelligent face cleaning and skin care method according to claim 1, wherein the step of setting the face cleaning and/or skin care mode corresponding to each skin area is followed by the steps of:
the method comprises the steps of obtaining personal information of a user, wherein the personal information comprises at least one item of sex, age, height, weight, work and rest habits, eating habits, constitution information and physical examination information, obtaining environment information of the day, wherein the environment information comprises at least one item of weather, air temperature, air humidity, ultraviolet intensity and PM2.5 index, and adjusting a face cleaning and/or skin care mode corresponding to at least one skin area according to the personal information and/or the environment information.
6. The intelligent face cleaning and skin care method according to claim 1, wherein the step of performing face cleaning and/or skin care using the face cleaning and/or skin care mode corresponding to the skin area is followed by the steps of:
recording a face cleaning and/or skin care track of a user, and judging whether the face cleaning and/or skin care track covers the whole skin area;
if the facial cleansing and/or skin care track does not cover the entire skin area, prompting the user that the skin area is not covered.
7. The intelligent face cleaning and skin care method according to claim 1, wherein the step of performing face cleaning and/or skin care using the face cleaning and/or skin care mode corresponding to the skin area comprises:
acquiring a face contour of a user according to the at least one face image, and determining a contour angle corresponding to the current contact position according to the face contour;
and adjusting the operation angle of the end part of the instrument according to the contour angle, so that the end part of the instrument is attached to the face contour of the user.
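A minimal sketch of the contour-angle step of claim 7: take the contour tangent at the point nearest the current contact position and steer the instrument end toward it. The contour representation and the per-cycle correction limit are assumptions:

```python
import math


def contour_angle(contour, contact_point):
    """contour: ordered list of (x, y) points along a closed face outline.
    Returns the tangent angle (degrees) at the contour point nearest the contact position."""
    i = min(range(len(contour)),
            key=lambda k: (contour[k][0] - contact_point[0]) ** 2 +
                          (contour[k][1] - contact_point[1]) ** 2)
    (x0, y0), (x1, y1) = contour[i - 1], contour[(i + 1) % len(contour)]
    return math.degrees(math.atan2(y1 - y0, x1 - x0))


def angle_correction(current_angle, target_angle, step=2.0):
    """Signed correction (degrees) toward the contour tangent, limited to `step`
    per control cycle (an assumed actuator limit)."""
    error = target_angle - current_angle
    return max(-step, min(step, error))
```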
8. A face cleaning and skin care instrument, characterized by comprising:
a skin analysis module, configured to connect with a preset intelligent terminal, acquire at least one face image of a user, and acquire a skin analysis result of the user according to the at least one face image;
a region dividing module, configured to acquire at least one skin feature of the face of the user according to the skin analysis result, divide regions with the same skin feature in the face region of the user into one skin area, and obtain at least one skin area of the face of the user;
a mode setting module, configured to evaluate the severity of the skin feature corresponding to each skin area and set a face cleaning and skin care mode corresponding to each skin area according to the evaluation result; and
a face cleaning and skin care module, configured to acquire the current contact position, determine the skin area corresponding to the current contact position, and perform face cleaning and skin care using the face cleaning and skin care mode corresponding to that skin area.
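How the four modules of claim 8 might be composed is sketched below; the class and method names are hypothetical and only show the data flow between the modules, not the patented implementation:

```python
class FacialCareInstrument:
    """Illustrative composition of the four modules of claim 8."""

    def __init__(self, skin_analysis, region_dividing, mode_setting, care):
        self.skin_analysis = skin_analysis      # face images  -> skin analysis result
        self.region_dividing = region_dividing  # analysis     -> skin areas
        self.mode_setting = mode_setting        # skin areas   -> per-area modes
        self.care = care                        # position + modes -> cleaning/care action

    def run(self, face_images, contact_position):
        analysis = self.skin_analysis.analyze(face_images)
        areas = self.region_dividing.divide(analysis)
        modes = self.mode_setting.set_modes(areas)
        return self.care.apply(contact_position, areas, modes)
```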
9. A face cleaning and skin care instrument, characterized by comprising: a memory having a computer program stored therein, and a processor coupled to the memory, wherein the processor executes the computer program to implement the method of any one of claims 1 to 7.
10. A computer-readable storage medium, in which a computer program is stored, the computer program being executable by a processor for implementing the method as claimed in any one of claims 1 to 7.
CN202010552906.XA 2020-06-17 2020-06-17 Intelligent face cleaning and skin care method, face cleaning and skin care instrument and computer readable storage medium Pending CN111759203A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010552906.XA CN111759203A (en) 2020-06-17 2020-06-17 Intelligent face cleaning and skin care method, face cleaning and skin care instrument and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010552906.XA CN111759203A (en) 2020-06-17 2020-06-17 Intelligent face cleaning and skin care method, face cleaning and skin care instrument and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN111759203A 2020-10-13

Family

ID=72722733

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010552906.XA Pending CN111759203A (en) 2020-06-17 2020-06-17 Intelligent face cleaning and skin care method, face cleaning and skin care instrument and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111759203A (en)

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106264523A (en) * 2015-06-03 2017-01-04 武汉朗立创科技有限公司 Skin protection suggesting method based on skin and environment measuring and system
CN106469302A (en) * 2016-09-07 2017-03-01 成都知识视觉科技有限公司 A kind of face skin quality detection method based on artificial neural network
CN108618752A (en) * 2017-03-23 2018-10-09 丽宝大数据股份有限公司 skin product adaptation method and electronic device thereof
CN110663241A (en) * 2017-03-31 2020-01-07 莱雅公司 Cosmetic device system with communication and power interface for cosmetic device
CN107330747A (en) * 2017-05-16 2017-11-07 深圳和而泰智能家居科技有限公司 Beauty appliance gear recommends method, beauty appliance and storage medium
CN107239671A (en) * 2017-06-27 2017-10-10 京东方科技集团股份有限公司 A kind of management method of skin condition, device and system
CN107898107A (en) * 2017-11-17 2018-04-13 张伟 Beauty provides system and method for work
CN109868611A (en) * 2017-12-01 2019-06-11 青岛海尔洗衣机有限公司 Control method for washing machine and washing machine
CN109744701A (en) * 2018-02-09 2019-05-14 深圳市洋沃电子有限公司 A kind of hair removal system, hair removal cloud system and hair removal method
CN108814747A (en) * 2018-05-04 2018-11-16 广东小天才科技有限公司 Intelligent electric toothbrush and control method thereof
CN109330449A (en) * 2018-10-23 2019-02-15 北京小米移动软件有限公司 Facial cleansing method and device
CN109635689A (en) * 2018-11-30 2019-04-16 北京小米移动软件有限公司 The method and apparatus and storage medium of cleaning skin
CN109948476A (en) * 2019-03-06 2019-06-28 南京七奇智能科技有限公司 A kind of face skin detection system based on computer vision and its implementation

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112333439A (en) * 2020-10-30 2021-02-05 南京维沃软件技术有限公司 Face cleaning equipment control method and device and electronic equipment
CN112641436A (en) * 2020-11-05 2021-04-13 西安拾玖岁信息科技有限公司 Cosmetic method and device
CN113331792A (en) * 2021-07-02 2021-09-03 北京美医医学技术研究院有限公司 Skin care prompt system based on skin condition
CN113610844A (en) * 2021-08-31 2021-11-05 深圳市邻友通科技发展有限公司 Intelligent skin care method, device, equipment and storage medium
CN114027969A (en) * 2021-12-13 2022-02-11 上海澄镜科技有限公司 Intelligent radio frequency beauty instrument and working method
CN114873045A (en) * 2022-06-10 2022-08-09 广州数美生物科技有限公司 A skin care product storage box
CN116035527A (en) * 2022-12-29 2023-05-02 深圳市沃特沃德信息有限公司 Method, device, equipment and storage medium for full face coverage based on selected functions
CN116035527B (en) * 2022-12-29 2025-09-05 深圳市沃特沃德信息有限公司 Method, device, equipment and storage medium for full face coverage based on selected functions
CN118602522A (en) * 2024-06-27 2024-09-06 Tcl空调器(中山)有限公司 Air conditioner control method, device, air conditioner and computer readable storage medium

Similar Documents

Publication Publication Date Title
CN111759203A (en) Intelligent face cleaning and skin care method, face cleaning and skin care instrument and computer readable storage medium
JP7553621B2 (en) Apparatus and method for determining cosmetic skin attributes
CN112084965B (en) Scalp hair detection device and system
EP3249562B1 (en) Method for obtaining care information, method for sharing care information, and electronic apparatus therefor
US11858154B2 (en) Operating a personal care device
US11080893B2 (en) Analysis unit and system for assessment of hair condition
JP2023040229A (en) Apparatus and method for visualizing cosmetic skin attributes
US11653873B2 (en) Skin detection device and product information determination method, device and system
JPWO2019208703A1 (en) Information processing device
EP3528690B1 (en) Accessory device and imaging device
WO2018202065A1 (en) Artificial intelligence-based system for recommending cosmetic product and service
CN113377020B (en) Equipment control method, device, equipment and storage medium
JP2022553431A (en) hair removal instructions
KR101949152B1 (en) Method and Appartus for Skin Condition Diagnosis and System for Providing Makeup Information suitable Skin Condition Using the Same
US20240032856A1 (en) Method and device for providing alopecia information
CN118215433A (en) Method and system for characterizing keratin fibres, in particular human eyelashes
WO2024182890A1 (en) Method and system for quantifying pain using electroencephalogram signals
KR20220156034A (en) How to identify dendritic pores
CN116747431B (en) Beauty instrument action position detection and energy output method, detection device, and beauty instrument
CN118873852A (en) Intelligent control method and device for scalp care equipment
JP2005242535A (en) Image correction device
CN115998247A (en) Cosmetic supply device and cosmetic supply method
CN119279572B (en) Hearing test method, device and storage medium
KR102234006B1 (en) Method and collect hair information
CN112767334B (en) Skin problem detection method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20201013)