
WO2018164394A1 - Method and system for providing result information for a procedure, and non-transitory computer-readable recording medium - Google Patents


Info

Publication number
WO2018164394A1
WO2018164394A1 (PCT application PCT/KR2018/002028)
Authority
WO
WIPO (PCT)
Prior art keywords
face
procedure
information
user
change
Prior art date
Application number
PCT/KR2018/002028
Other languages
English (en)
Korean (ko)
Inventor
김진수
최흥산
김희진
최종우
허창훈
Original Assignee
주식회사 모르페우스
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020170155144A (KR102006019B1)
Application filed by 주식회사 모르페우스
Publication of WO2018164394A1

Links

Images

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 - Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/30 - Determination of transform parameters for the alignment of images, i.e. image registration

Definitions

  • the present invention relates to a method, system and non-transitory computer readable recording medium for providing result information for a procedure.
  • in the prior art, a technique has been introduced that acquires a pre-operative face image and a post-operative face image and divides the screen so that both images can be displayed together, making it easier to visually compare the changes in the face.
  • the present invention aims to solve all of the above-mentioned problems of the prior art.
  • Another object of the present invention is to quantitatively grasp information on changes in the user's face caused by the procedure.
  • Another object of the present invention is to enable an objective and specific evaluation of the result of the procedure, such as whether the procedure was performed well.
  • According to one aspect of the present invention, there is provided a method of providing result information for a procedure, the method comprising: obtaining three-dimensional measurement data on the shape of a user's face before and after the procedure; calculating, with reference to the obtained three-dimensional measurement data, information on changes in the user's face according to the procedure; and generating procedure result information based on an evaluation index associated with the procedure, with reference to the calculated information on the changes in the user's face.
  • According to another aspect of the present invention, there is provided a system for providing result information on a procedure, the system including: a measurement data acquisition unit for obtaining three-dimensional measurement data on the shape of the user's face before and after the procedure; a change information management unit for calculating, with reference to the obtained three-dimensional measurement data, information on changes in the user's face according to the procedure; and a result information management unit for generating procedure result information based on an evaluation index associated with the procedure, with reference to the calculated information on the changes in the user's face.
  • There are further provided other methods and systems for implementing the present invention, and non-transitory computer-readable recording media for recording computer programs for executing the methods.
  • FIG. 1 is a view showing a schematic configuration of an entire system for providing result information for a procedure according to an embodiment of the present invention.
  • FIG. 2 is a diagram showing in detail the internal configuration of the information providing system according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a process of providing result information for a procedure according to an embodiment of the present invention.
  • FIG. 4 is a diagram exemplarily illustrating facial feature points that may be used to calculate information about a face change according to an exemplary embodiment.
  • FIGS. 5 to 23 are diagrams exemplarily illustrating information about a face change, according to an exemplary embodiment.
  • FIG. 1 is a view showing a schematic configuration of an entire system for providing result information for a procedure according to an embodiment of the present invention.
  • the entire system may include a communication network 100, an information providing system 200, and a device 300.
  • the communication network 100 may be configured in any communication mode, such as wired or wireless communication, and may include various communication networks such as a local area network (LAN), a metropolitan area network (MAN), and a wide area network (WAN).
  • the communication network 100 as used herein may be a known Internet or World Wide Web (WWW).
  • the communication network 100 may include, at least in part, a known wired / wireless data communication network, a known telephone network, or a known wired / wireless television communication network without being limited thereto.
  • Preferably, the communication network 100 is a wireless data communication network, and may be implemented at least in part by Wi-Fi communication, Wi-Fi Direct communication, Long Term Evolution (LTE) communication, Bluetooth communication (for example, Bluetooth Low Energy (BLE)), infrared communication, ultrasonic communication, and the like.
  • the information providing system 200 may communicate with the device 300 (described later) through the communication network 100, and may perform the functions of acquiring three-dimensional measurement data on the shape of the user's face before and after the procedure, calculating information on changes in the user's face according to the procedure with reference to the acquired three-dimensional measurement data, and generating procedure result information based on an evaluation index associated with the procedure with reference to the calculated information on the changes in the user's face.
  • the above-described procedure may be a concept that collectively refers to surgery, correction, cosmetic treatment, and the like, in which the shape of the user's face may change.
  • the configuration and functions of the information providing system 200 according to the present invention will be described in detail below. Although the information providing system 200 has been described as above, this description is exemplary, and it will be apparent to those skilled in the art that at least some of the functions or components required for the information providing system 200 may, as necessary, be implemented within the device 300 or an external system (not shown), or be included therein.
  • the device 300 is a digital device that includes a function enabling communication after connecting to the information providing system 200 through the communication network 100; any portable digital device having memory means and a microprocessor, such as a smart phone or a tablet PC, can be adopted as the device 300 according to the present invention.
  • the device 300 may include an application for supporting a function according to the present invention for providing result information on a procedure.
  • an application may be downloaded from the information providing system 200 or an external application distribution server (not shown).
  • FIG. 2 is a diagram showing in detail the internal configuration of the information providing system 200 according to an embodiment of the present invention.
  • the information providing system 200 may be a digital device having a computing power by including a memory means and a microprocessor.
  • the information providing system 200 may be a server system.
  • the information providing system 200 may include a measurement data acquisition unit 210, a change information management unit 220, a result information management unit 230, a communication unit 240, and a control unit 250. At least some of these components may be program modules that communicate with an external system.
  • Such program modules may be included in the information providing system 200 in the form of an operating system, an application program module, or other program modules, and may be physically stored in various known storage devices.
  • the program module may be stored in a remote storage device that can communicate with the information providing system 200.
  • program modules include, but are not limited to, routines, subroutines, programs, objects, components, and data structures that perform particular tasks or implement particular abstract data types, as described below in accordance with the present invention.
  • the measurement data acquisition unit 210 may acquire three-dimensional measurement data regarding the shape of the face before and after the procedure.
  • the three-dimensional measurement data according to an embodiment of the present invention may include at least one of three-dimensional scanning data of the user's face and three-dimensional imaging data of the hard tissue of the user's face.
  • in order to obtain the three-dimensional scanning data or three-dimensional imaging data, the measurement data acquisition unit 210 may refer to information obtained from at least one of X-ray, ultrasound, computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), and a 3D scanner.
  • the measurement data acquisition unit 210 may acquire 3D measurement data regarding the shape of the user's face based on the 2D measurement data regarding the shape of the user's face.
  • More specifically, the measurement data acquisition unit 210 may acquire two-dimensional measurement data (e.g., a photograph or picture) of the shape of the user's face, and may convert the user's face included in the obtained two-dimensional measurement data into three-dimensional data based on a predetermined three-dimensional transformation algorithm such as perspective projection transformation (for example, converting it into three-dimensional data with a predetermined point of the user's two-dimensional face data as a reference), thereby obtaining three-dimensional measurement data on the shape of the user's face.
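The perspective-projection-based 2D-to-3D conversion described above can be illustrated with a minimal pinhole-camera back-projection. The disclosure does not specify the algorithm's details, so the intrinsic parameters and the reference depth (e.g., taken at the nose tip) below are assumptions for illustration only:

```python
import math

def backproject(u, v, depth, fx, fy, cx, cy):
    """Back-project a 2-D image landmark (u, v) to a 3-D camera-space
    point, assuming a pinhole (perspective projection) camera model and
    a known depth for the landmark (e.g. taken from a reference point
    such as the nose tip)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Hypothetical intrinsics and a landmark at the principal point:
fx = fy = 800.0          # focal lengths in pixels (assumed)
cx, cy = 320.0, 240.0    # principal point (assumed)
nose_tip_2d = (320.0, 240.0)
nose_tip_3d = backproject(*nose_tip_2d, depth=600.0, fx=fx, fy=fy, cx=cx, cy=cy)
# A landmark at the principal point back-projects onto the optical axis:
print(nose_tip_3d)   # (0.0, 0.0, 600.0)
```

In practice the reference depth would be supplied by the 3D scanning or imaging data rather than assumed.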
  • the change information management unit 220 may calculate information on changes in the user's face according to the procedure by referring to the three-dimensional measurement data obtained by the measurement data acquisition unit 210.
  • the information on the face change may include values compared before and after the procedure with respect to at least one of: the length between predetermined facial points, the distance between predetermined facial points, the angle between a plurality of facial points, the height between predetermined facial points, the area of a predetermined facial region, the volume of a predetermined facial region, and the ratio between predetermined facial points.
  • the change information management unit 220 may calculate the information on the face change based on reference feature points specified on each of the user's faces before and after the procedure. More specifically, the change information management unit 220 according to an embodiment of the present invention may specify predetermined reference feature points on the user's pre-procedure face (that is, face image) obtained from the three-dimensional measurement data on the shape of the face before the procedure, and may likewise specify predetermined reference feature points on the user's post-procedure face (that is, face image) obtained from the three-dimensional measurement data on the shape of the face after the procedure.
  • the change information management unit 220 may then calculate, as the information on the face change, comparison information calculated based on the reference feature points specified on each of the user's faces (that is, face images) before and after the procedure: for example, information comparing, before and after the procedure, at least one of the length between reference feature points of the face, the distance between reference feature points, the angle between a plurality of reference feature points, the height between reference feature points, the area of a predetermined region including reference feature points, the volume of a predetermined region including reference feature points, and the ratio between reference feature points.
  • the reference feature point may be a concept including anatomical landmarks, acupuncture points, and the like for main parts such as the eyes, nose, and mouth.
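The before/after comparison values listed above (lengths, distances, angles, and ratios between reference feature points) reduce to elementary 3D geometry. The following is an illustrative sketch; the landmark names and coordinates are hypothetical, not values from the disclosure:

```python
import math

def dist(p, q):
    """Euclidean distance between two 3-D feature points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def angle(a, b, c):
    """Angle at vertex b (degrees) formed by feature points a-b-c."""
    u = [x - y for x, y in zip(a, b)]
    v = [x - y for x, y in zip(c, b)]
    cos_t = sum(x * y for x, y in zip(u, v)) / (dist(a, b) * dist(c, b))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

# Hypothetical reference feature points (mm) before and after a procedure:
before = {"nose_tip": (0.0, 0.0, 10.0), "chin": (0.0, -60.0, 0.0), "brow": (0.0, 40.0, 5.0)}
after  = {"nose_tip": (0.0, 0.0, 14.0), "chin": (0.0, -60.0, 0.0), "brow": (0.0, 40.0, 5.0)}

# Compared values of the kinds listed above: a distance change and an angle change.
dist_change = dist(after["nose_tip"], after["chin"]) - dist(before["nose_tip"], before["chin"])
angle_change = (angle(after["brow"], after["nose_tip"], after["chin"])
                - angle(before["brow"], before["nose_tip"], before["chin"]))
```

Areas, volumes, and ratios would be computed analogously from the same specified feature points.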
  • the change information management unit 220 may calculate the information on the change of the user's face by comparing and analyzing the user's face before the procedure with the user's face at each of a plurality of time points after the procedure.
  • For example, the change information management unit 220 may compare the user's face before the procedure with the user's face at a plurality of cycles after the procedure (e.g., 1 week, 2 weeks, and 3 weeks after the procedure), and may calculate, as the information on the face change, information on the change of the user's face according to the progress of time after the procedure.
  • the change information management unit 220 may estimate the shape and position of at least one type of anatomical layer included in the soft tissue of each of the user's faces before and after the procedure, by referring to at least one anatomical face model associated with the user's information and the three-dimensional measurement data on the shape of the face before and after the procedure. The change information management unit 220 according to an embodiment of the present invention may then calculate the information on the change of the user's face by referring to the user's faces before and after the procedure specified by the above estimation.
  • the anatomical face model according to an embodiment of the present invention may include modeled data regarding the shape and location of at least one type of anatomical layer included in the soft tissue of the face.
  • For example, such an anatomical face model may be obtained by scanning each exposed layer while sequentially peeling off the anatomical layers of the soft tissue of a cadaver's face, and may include data modeling the shape and location of at least one type of anatomical layer that can be included in the soft tissue of the face.
  • As another example, the anatomical face model may be obtained by analyzing the data obtained above through known machine learning or big data algorithms, such as the support vector machine (SVM) algorithm, the multivariate adaptive regression spline (MARS) algorithm, the k-nearest neighbor (KNN) algorithm, and neural networks (NN), and may be an anatomical face model normalized according to demographic indicators.
  • the above-described anatomical layer according to an embodiment of the present invention may include a layer relating to at least one of muscle, fat, blood vessels, nerves, and lymphatic vessels, and the above-described user information may include information on the user's race, ethnicity, gender, and age.
  • the types of anatomical layers and the user information according to an embodiment of the present invention are not necessarily limited to those listed above; within the scope that can achieve the object of the present invention, other anatomical layers such as the periosteum, fascia, and ligaments may be added, and the user information may be changed to include, for example, the user's country or region of residence.
  • More specifically, the change information management unit 220 may compare the features of the user's face, extracted from the three-dimensional measurement data on the shape of each of the user's faces before and after the procedure, against at least one anatomical face model associated with the user's information, and by applying the shape and position of at least one type of anatomical layer included in that anatomical face model to the shape of each of the user's faces before and after the procedure, may estimate the shape or position of at least one type of anatomical layer included in the soft tissue of each of the user's faces before and after the procedure.
  • In order to compare and analyze the three-dimensional measurement data on the shape of the user's face before and after the procedure against at least one anatomical face model associated with the user's information, the change information management unit 220 may use algorithms such as the active appearance model (AAM) algorithm, the active shape model (ASM) algorithm, the composite constraint AAM algorithm, the iterative closest points (ICP) algorithm, and non-rigid registration algorithms.
  • For example, the change information management unit 220 may extract the user's facial features from the three-dimensional measurement data on the shape of the face before and after the procedure using the active appearance model (AAM) algorithm, the active shape model (ASM) algorithm, or the composite constraint AAM algorithm, and may apply the shape and position of at least one type of anatomical layer included in the anatomical face model to the shape of each of the user's faces before and after the procedure using the iterative closest points (ICP) algorithm or a non-rigid registration algorithm.
  • the algorithms for comparing and analyzing the three-dimensional measurement data on the shape of the face against at least one anatomical face model associated with the user's information are not necessarily limited to those listed above, and other algorithms may be utilized within the scope of the object of the present invention.
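As a simplified illustration of the registration step: the disclosure names AAM/ASM, ICP, and non-rigid registration, but the sketch below shows only a translation-only landmark alignment, the simplest building block of such a pipeline; the landmark coordinates are hypothetical:

```python
def centroid(pts):
    """Centroid of a set of 3-D points."""
    n = len(pts)
    return tuple(sum(p[i] for p in pts) / n for i in range(3))

def align_by_centroid(source, target):
    """Translation-only alignment of two corresponding landmark sets:
    shift the source so its centroid coincides with the target's.
    A full rigid registration (e.g. the SVD step inside ICP) would also
    estimate a rotation, and a non-rigid registration would additionally
    deform the source; both are omitted from this sketch."""
    c_s, c_t = centroid(source), centroid(target)
    shift = tuple(c_t[i] - c_s[i] for i in range(3))
    return [tuple(p[i] + shift[i] for i in range(3)) for p in source]

# Hypothetical landmarks: the "scan" is the model displaced by 5 mm in x.
model = [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0), (0.0, 3.0, 0.0)]
scan  = [(5.0, 0.0, 0.0), (8.0, 0.0, 0.0), (5.0, 3.0, 0.0)]
aligned = align_by_centroid(scan, model)   # coincides with `model`
```

In an actual ICP loop, the correspondence step (nearest-neighbor matching) and this alignment step would alternate until convergence.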
  • the change information management unit 220 may determine information about the change of the user's face by referring to the face of the user before and after the procedure specified by the above estimation.
  • More specifically, the change information management unit 220 may determine the information on the change of the user's face by comparing and analyzing the user's face before the procedure specified by the above estimation and the user's face after the procedure (or at a plurality of post-procedure time points) specified by the above estimation.
  • In addition, the change information management unit 220 may determine, as the information on the change of the user's face, information on how the shape and position of at least one type of anatomical layer included in the soft tissue of each of the user's faces before and after the procedure, specified by the above estimation, have changed.
  • More specifically, the change information management unit 220 may compare and analyze the shape and position of at least one type of anatomical layer included in the soft tissue of the user's face before the procedure with the shape and position of at least one type of anatomical layer included in the soft tissue of the user's face after the procedure (or at a plurality of post-procedure time points), and may thereby determine, as the information on the change of the user's face, information on how the shape and position of each such anatomical layer change before and after the procedure (for example, the change in bone thickness and the change in the thickness of the fat layer between before and after the procedure).
  • the result information management unit 230 may generate procedure result information based on an evaluation index associated with the procedure, by referring to the information on the change of the user's face calculated by the change information management unit 220.
  • the evaluation index associated with the procedure may be a concept that includes the goal of the procedure, the effect or performance of a drug used in the procedure, and which part of the user's face should change, and by how much, through the procedure.
  • the evaluation index associated with such a procedure may be set based on result information on the procedures of a plurality of users using known machine learning or big data algorithms, and may be set according to user information such as the user's age, country, and gender, the type of the procedure, the type of drug used in the procedure, and the like.
  • For example, when the distance between predetermined points or the ratio between predetermined points on the user's face is changed by the procedure, the result information management unit 230 may compare and analyze the changed distance or ratio against the numerical range according to the evaluation index associated with the procedure the user received, and may thereby generate procedure result information on the degree to which the evaluation index is lacking (or exceeded), whether the evaluation index is satisfied, and the like.
  • As another example, when a certain drug (for example, a filler) is injected during the procedure and the volume of a predetermined region of the user's face changes, the result information management unit 230 may compare and analyze that volume change against the numerical range according to the evaluation index associated with the procedure the user received (e.g., the volume change expected from the drug), and may thereby generate procedure result information on the degree to which the evaluation index is lacking (or exceeded), whether the evaluation index is satisfied, and the like.
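The comparison of a measured change against the numerical range of an evaluation index can be sketched as follows; the function name, range, and measured value are hypothetical illustrations, not values from the disclosure:

```python
def evaluate(change, index_range):
    """Compare a measured facial change against the numerical range of an
    evaluation index and report satisfaction, shortfall, and excess."""
    lo, hi = index_range
    if change < lo:
        return {"satisfied": False, "shortfall": lo - change, "excess": 0.0}
    if change > hi:
        return {"satisfied": False, "shortfall": 0.0, "excess": change - hi}
    return {"satisfied": True, "shortfall": 0.0, "excess": 0.0}

# Hypothetical filler procedure: the evaluation index expects a volume
# gain of 1.0 to 2.0 cc in the treated region; the measured gain is 0.6 cc,
# so the result reports a shortfall of about 0.4 cc.
result = evaluate(0.6, (1.0, 2.0))
```

The same comparison applies to distance or ratio changes against their respective index ranges.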
  • the communication unit 240 may perform a function of enabling data transmission and reception from/to the measurement data acquisition unit 210, the change information management unit 220, and the result information management unit 230.
  • the control unit 250 may perform a function of controlling the flow of data among the measurement data acquisition unit 210, the change information management unit 220, the result information management unit 230, and the communication unit 240. That is, the control unit 250 according to the present invention may control the data flow into/out of the information providing system 200 or the data flow between the components of the information providing system 200, so that the measurement data acquisition unit 210, the change information management unit 220, the result information management unit 230, and the communication unit 240 each perform their unique functions.
  • FIG. 3 is a diagram illustrating a process of providing result information for a procedure according to an embodiment of the present invention.
  • FIG. 4 is a diagram exemplarily illustrating facial feature points that may be used to calculate information about a face change according to an exemplary embodiment.
  • FIGS. 5 to 23 are diagrams exemplarily illustrating information about a face change, according to an exemplary embodiment.
  • Referring to FIG. 3, the information providing system 200 may obtain three-dimensional computed tomography data of the hard tissue of each of the user's faces before and after the procedure, or three-dimensional scanning data of each of the user's faces before and after the procedure (301, 302).
  • Next, the information providing system 200 may select at least one anatomical face model (303, 304) associated with at least one of the user's race, gender, and age, and by applying the shape or position of at least one type of anatomical layer included in the selected anatomical face model (303, 304) to the shape of each of the user's faces before and after the procedure using non-rigid registration algorithms (305, 306), may estimate the shape or position of at least one type of anatomical layer included in the soft tissue of each of the user's faces before and after the procedure (307, 308).
  • Next, the information providing system 200 may calculate information 311 on the face change by referring to the user's faces (309, 310) before and after the procedure specified by the above estimation.
  • the above-described information 311 on the face change may be calculated using the reference feature points of the user's face before and after the procedure.
  • For example, the information providing system 200 may calculate, as the information 311 on the face change, information on the volume change of the upper, middle, and lower parts of each of the user's faces before and after the procedure (specifically, 2 weeks and 3 weeks after the procedure).
  • the information on the volume change of the face may be measured based on the volumes of six regions generated by dividing the user's face into upper (501), middle (502), and lower (503) parts and then dividing each of these regions left and right along the center line of the face.
  • the total volume of the user's face may be calculated by adding up the volumes of the six areas.
  • the information on the facial volume change may be measured separately for soft tissue and hard tissue, and may be information used to generate result information on a procedure for facial asymmetry correction.
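The six-region volume bookkeeping described above might be sketched as follows; the region names and volumes are hypothetical, and actual volumes would come from integrating the 3D measurement data:

```python
def face_volume(region_volumes):
    """Total face volume as the sum of six region volumes: upper, middle,
    and lower bands (cf. regions 501-503), each split left/right along
    the facial midline."""
    expected = {"upper_l", "upper_r", "middle_l", "middle_r", "lower_l", "lower_r"}
    assert set(region_volumes) == expected
    return sum(region_volumes.values())

def asymmetry(region_volumes):
    """Left-right volume difference per band: a simple cue for generating
    result information on an asymmetry-correction procedure."""
    return {band: region_volumes[band + "_l"] - region_volumes[band + "_r"]
            for band in ("upper", "middle", "lower")}

# Hypothetical pre-procedure region volumes (cc):
before = {"upper_l": 210.0, "upper_r": 212.0, "middle_l": 180.0,
          "middle_r": 174.0, "lower_l": 150.0, "lower_r": 150.0}
total = face_volume(before)             # 1076.0
mid_asym = asymmetry(before)["middle"]  # 6.0 (left larger than right)
```

Comparing these values before and after the procedure yields the volume-change information 311.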
  • For another example, the information providing system 200 may calculate, as the information 311 on the face change, information on the change of depth by contour of the user's face before and after the procedure (specifically, 3 weeks after the procedure). According to an embodiment of the present invention, the information on the change of the contour depth may be measured based on contours generated sequentially at 0.25 mm intervals from the tip of the nose, on cross sections perpendicular to the front-back direction of the user's face.
  • by using the information on the change of the contour depth, the relative height change of each part of the face can be easily determined from the contour lines, and the left-right asymmetry of the face and the smoothness of the facial curvature can also be measured.
  • the information on the contour depth change may be used to generate result information for procedures such as facial contour surgery to reduce the left-right width of the face (for example, for Asians, whose front-back facial depth is shorter than that of Westerners), as well as forehead plastic surgery, rhinoplasty, and chin-tip plastic surgery.
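The generation of 0.25 mm contour bands from the nose tip can be approximated by binning scan points along the front-back axis. This is a rough sketch under the assumption that the front-back direction is the z axis and the nose tip is the most anterior point; the sample points are hypothetical:

```python
def contour_bands(points, spacing=0.25):
    """Group 3-D face-scan points into depth bands measured from the nose
    tip (the point with the largest front-back coordinate, here z) at
    `spacing`-mm intervals, approximating contours on cross sections
    perpendicular to the front-back direction."""
    z_max = max(p[2] for p in points)
    bands = {}
    for p in points:
        idx = int((z_max - p[2]) / spacing)   # band 0 contains the nose tip
        bands.setdefault(idx, []).append(p)
    return bands

# Three hypothetical scan points (mm):
pts = [(0.0, 0.0, 10.0), (1.0, 0.0, 9.9), (0.0, 1.0, 9.4)]
bands = contour_bands(pts)
# nose tip (z=10.0) falls in band 0; z=9.9 also band 0; z=9.4 falls in band 2
```

On a real scan, each band would trace a closed contour whose shape before and after the procedure is then compared.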
  • For another example, the information providing system 200 may calculate, as the information 311 on the face change, information on the change in color by depth map according to the depth of the user's face before and after the procedure.
  • the information on the color change according to facial depth represents facial height information, such as the depth by contour above, as changes in color, so that the relative height change of each part of the user's face can be easily judged through color comparison.
  • the information on the color change according to facial depth may be used to generate result information for procedures such as forehead plastic surgery, rhinoplasty, and chin-tip plastic surgery, which change the height of parts such as the forehead, nose, and chin tip.
  • For another example, the information providing system 200 may calculate, as the information 311 on the face change, information on the change in the anterior-posterior (AP) projection distance of the user's face before and after the procedure (specifically, 2 weeks and 3 weeks after the procedure).
  • the information on the front-rear projection distance change may be information in which the front-rear distance 1101 from a reference feature point of the user's face to a reference plane behind the face is measured; that is, it may be measured based on the front-rear distance from a predetermined point of the user's face to the reference plane.
  • the reference plane may mean a plane that passes through the average position of the left and right tragus points and is perpendicular to the front-back direction of the face.
  • the information on the front-rear projection change may be used to generate result information on procedures such as forehead plastic surgery, rhinoplasty, and chin-tip plastic surgery, which move the forehead, nose, or chin tip forward or backward.
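Once the face is axis-aligned, the anterior-posterior projection distance to the reference plane through the average tragus position reduces to a coordinate difference. A sketch with hypothetical landmark coordinates (the axis convention and values are assumptions):

```python
def ap_projection_distance(point, left_tragus, right_tragus):
    """Front-back (anterior-posterior) distance from a facial feature
    point to the reference plane: the plane passing through the average
    position of the left and right tragus points and perpendicular to
    the front-back axis (taken here as z, pointing forward)."""
    plane_z = (left_tragus[2] + right_tragus[2]) / 2.0
    return point[2] - plane_z

# Hypothetical landmarks (mm): tragi at z = -70, chin tip moving forward.
tragus_l, tragus_r = (-70.0, 0.0, -70.0), (70.0, 0.0, -70.0)
proj_before = ap_projection_distance((0.0, -60.0, 30.0), tragus_l, tragus_r)  # 100.0
proj_after  = ap_projection_distance((0.0, -60.0, 33.0), tragus_l, tragus_r)  # 103.0
proj_change = proj_after - proj_before   # 3.0 mm of projection gained
```

The change in this distance between the pre- and post-procedure scans is the information 311 for this metric.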
  • For another example, the information providing system 200 may calculate, as the information 311 on the face change, information on the change in nose and lip lengths of the user's face before and after the procedure (specifically, 2 weeks and 3 weeks after the procedure).
  • the information on the nose and lip distance changes may be measured based on the lengths between various reference feature points related to the nose and lips, such as the nasal bridge length, the nasal bridge height, and the aesthetic line.
  • the information on the distance change of the nose and lips may indicate a change in the relative or absolute lengths associated with the user's nose and lips, and may be used to generate result information on procedures such as rhinoplasty, jaw correction surgery, and orthognathic surgery.
  • the information providing system 200 may calculate, as the information 311 on the face change, information on a change in nose and lip angles of the user's face before and after the procedure.
  • the information about the change of the angle of the nose and the lips may be information measured based on angles formed by various reference feature points related to the nose and lips, such as a nasofrontal angle, a nasolabial angle, and a labiomental angle.
  • the information about the change of the angle of the nose and the lips may be information used to generate result information for a procedure such as nose plastic surgery and jaw surgery.
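A hedged illustration of how one such angle (e.g. a nasolabial angle) could be measured from three 3D landmarks; the function name and the coordinate values are assumptions for illustration, not part of the disclosure:

```python
import numpy as np

def landmark_angle(a, vertex, b):
    """Angle in degrees formed at `vertex` by the rays toward landmarks
    `a` and `b` (e.g. columella point - subnasale - upper lip point)."""
    u = np.asarray(a, float) - np.asarray(vertex, float)
    v = np.asarray(b, float) - np.asarray(vertex, float)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # clip guards against arccos domain errors from floating-point noise
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

# a right-angle configuration as a sanity check
angle = landmark_angle([0, 1, 0], [0, 0, 0], [1, 0, 0])  # 90.0
```

The before/after change is then simply the difference of the two measured angles at the same three landmarks.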
  • the information providing system 200 may calculate, as the information 311 regarding a face change, information regarding a change in a curve distance of the user's face before and after the procedure.
  • the information on the change in the curve distance may be calculated by measuring, for each of the left and right sides of the user's face, a change in the curve distance from the tragus point of the ear to a reference feature point such as the eyes, the nose, the mouth, or the chin.
  • the information on the change in the curve distance may be information used to generate result information on a procedure such as eye surgery, nose surgery, facial contour surgery, wrinkle surgery, or the like.
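The curve distance along the face surface can be approximated as the length of a polyline through points sampled on the 3D scan between the two landmarks. This is a minimal sketch under that assumption; the sampling itself (how points are taken along the surface) is not specified here:

```python
import numpy as np

def curve_distance(points):
    """Length of a surface curve approximated as a polyline through
    3D points sampled along it (e.g. tragus -> cheek -> nose wing)."""
    pts = np.asarray(points, dtype=float)
    # sum of segment lengths between consecutive sample points
    return float(np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1)))

path = [[0, 0, 0], [3, 4, 0], [3, 4, 12]]  # segment lengths 5 and 12
length = curve_distance(path)
```

Denser sampling along the scanned mesh gives a closer approximation to the true geodesic length; the before/after change is the difference of the two lengths measured over the same landmark pair.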
  • the information providing system 200 may calculate, as the information 311 on the face change, information on distance and angle changes measured based on a plurality of reference feature points of the user's face before and after the procedure.
  • the information providing system 200 according to an embodiment of the present invention may compare and analyze the shape and position 307 of at least one type of anatomical layer included in the soft tissue of the user's face before the procedure, specified by the above estimation, with the shape and position 308 of at least one type of anatomical layer included in the soft tissue of the user's face after the procedure, and may thereby calculate, as the information 311 about the change of the user's face, information on how the shape and position of each of the at least one type of anatomical layer changes before and after the procedure.
  • the information providing system 200 may generate 313 the procedure result information, based on the evaluation index 312 associated with the procedure, with reference to the information 311 regarding the user's face change calculated above.
  • the information providing system 200 may calculate, from the calculated information about the user's face change, numerical information 801, 1201, 1401, 1601, 1801, 1901, 2001, 2101, 2201, 2301 on values changed in the user's face before and after the procedure, and may generate procedure result information 313 including evaluation information on whether the evaluation index associated with the procedure was achieved, by comparing the calculated numerical information with that evaluation index.
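One possible sketch of that comparison step; the tolerance value, the function name, and the returned dictionary structure are illustrative assumptions rather than anything specified by the disclosure:

```python
def evaluate_procedure(measured_change, evaluation_index, tolerance=0.5):
    """Compare a measured facial change (e.g. mm of AP projection gained)
    against the target evaluation index and report achievement plus the
    residual gap between them."""
    gap = measured_change - evaluation_index
    achieved = abs(gap) <= tolerance  # within tolerance counts as achieved
    return {"achieved": achieved, "gap": gap}

# e.g. 3.0 mm measured vs. a 3.2 mm target, within a 0.5 mm tolerance
result = evaluate_procedure(measured_change=3.0, evaluation_index=3.2)
```

In practice each numerical item (distance, angle, curve length, layer displacement) would be compared against its own index, and the per-item evaluations aggregated into the procedure result information 313.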
  • Embodiments according to the present invention described above can be implemented in the form of program instructions that can be executed by various computer components and recorded in a computer-readable recording medium.
  • the computer-readable recording medium may include program instructions, data files, data structures, etc. alone or in combination.
  • Program instructions recorded on the computer-readable recording medium may be specially designed and configured for the present invention, or may be known and available to those skilled in the computer software arts.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • Examples of program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
  • the hardware device may be configured to operate as one or more software modules to perform the processing according to the present invention, and vice versa.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

According to one aspect, the present invention provides a method for providing information on the result of a procedure, comprising: a step of acquiring 3D measurement data on the shape of a user's face before and after a procedure; a step of calculating information related to a change in the user's face caused by the procedure with reference to the acquired 3D measurement data; and a step of generating information on the result of the procedure, based on an evaluation index associated with the procedure, with reference to the calculated information on the change in the user's face.
PCT/KR2018/002028 2017-03-10 2018-02-19 Procédé et système permettant de procurer des informations sur le résultat de procédure, support d'enregistrement lisible par ordinateur non transitoire WO2018164394A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2017-0030745 2017-03-10
KR20170030745 2017-03-10
KR10-2017-0155144 2017-11-20
KR1020170155144A KR102006019B1 (ko) 2017-03-10 2017-11-20 시술에 대한 결과 정보를 제공하는 방법, 시스템 및 비일시성의 컴퓨터 판독 가능 기록 매체

Publications (1)

Publication Number Publication Date
WO2018164394A1 true WO2018164394A1 (fr) 2018-09-13

Family

ID=63448321

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/002028 WO2018164394A1 (fr) 2017-03-10 2018-02-19 Procédé et système permettant de procurer des informations sur le résultat de procédure, support d'enregistrement lisible par ordinateur non transitoire

Country Status (1)

Country Link
WO (1) WO2018164394A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4328621B2 (ja) * 2001-10-31 2009-09-09 イマグノーシス株式会社 医療用シミュレーション装置
KR20120096238A (ko) * 2011-02-22 2012-08-30 주식회사 모르페우스 안면보정 이미지 제공방법 및 그 시스템
KR101438011B1 (ko) * 2013-11-08 2014-09-04 수원대학교산학협력단 3차원 스캐너를 이용한 얼굴 인식 시스템
KR20170025162A (ko) * 2015-08-27 2017-03-08 연세대학교 산학협력단 얼굴 영상의 얼굴 나이 변환 방법 및 그 장치

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4328621B2 (ja) * 2001-10-31 2009-09-09 イマグノーシス株式会社 医療用シミュレーション装置
KR20120096238A (ko) * 2011-02-22 2012-08-30 주식회사 모르페우스 안면보정 이미지 제공방법 및 그 시스템
KR101438011B1 (ko) * 2013-11-08 2014-09-04 수원대학교산학협력단 3차원 스캐너를 이용한 얼굴 인식 시스템
KR20170025162A (ko) * 2015-08-27 2017-03-08 연세대학교 산학협력단 얼굴 영상의 얼굴 나이 변환 방법 및 그 장치

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHANGHUN HUH: "V3D Evaluation of Facial Beauty: Normal and Super-Normal", 3RD GALAA CONFERENCE, 18 February 2017 (2017-02-18), Bangkok, Thailand, pages 8 - 42 *

Similar Documents

Publication Publication Date Title
KR102006019B1 (ko) 시술에 대한 결과 정보를 제공하는 방법, 시스템 및 비일시성의 컴퓨터 판독 가능 기록 매체
US7760923B2 (en) Method and system for characterization of knee joint morphology
Naudi et al. The virtual human face: superimposing the simultaneously captured 3D photorealistic skin surface of the face on the untextured skin image of the CBCT scan
US20240057869A1 (en) System, method and kit for 3d body imaging
Munn et al. Changes in face topography from supine-to-upright position—and soft tissue correction values for craniofacial identification
CN112704486B (zh) 基于电磁仿真计算的tms线圈位姿图谱生成方法
US20150255005A1 (en) Movement evaluation device and program therefor
CN103679816B (zh) 一种面向刑侦的未知身源颅骨的计算机辅助面貌复原方法
CN103955675A (zh) 一种人脸特征提取方法
JP2016085490A (ja) 顔形態の評価システム及び評価方法
WO2018164329A1 (fr) Procédé et système de fourniture d'informations concernant un visage à l'aide d'une couche anatomique et support d'enregistrement non transitoire lisible par ordinateur
WO2018164394A1 (fr) Procédé et système permettant de procurer des informations sur le résultat de procédure, support d'enregistrement lisible par ordinateur non transitoire
Christensen et al. Automatic measurement of the labyrinth using image registration and a deformable inner ear atlas
US9020192B2 (en) Human submental profile measurement
TWI731447B (zh) 美容促進裝置、美容促進系統、美容促進方法、及美容促進程式
WO2018164328A1 (fr) Procédé et système d'estimation de visage au moyen d'une couche anatomique, et support d'enregistrement non transitoire lisible par ordinateur
WO2018164327A1 (fr) Procédé et système d'estimation de couche anatomatique d'un visage, et support d'enregistrement lisible par ordinateur non transitoire
Park et al. Quantitative evaluation of facial sagging in different body postures using a three‐dimensional imaging technique
JP2009054060A (ja) 顔の形状評価方法
CA3170767A1 (fr) Systeme, procede et appareil de mesure d'asymetrie de temperature de parties de corps
KR102475962B1 (ko) 임상 영상의 시뮬레이션 방법 및 장치
Takwale et al. A practical guide to the standardization of hair loss photography for clinicians
Grewe et al. Fast and accurate digital morphometry of facial expressions
US20250009274A1 (en) Electrocardiogram lead generation
WO2022173055A1 (fr) Procédé, dispositif, programme et système d'estimation de squelette, procédé de génération de modèle formé et modèle formé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18763946

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 05.12.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 18763946

Country of ref document: EP

Kind code of ref document: A1
