US20080049993A1 - System and method for counting follicular units - Google Patents
- Publication number
- US20080049993A1 (application Ser. No. 11/467,283)
- Authority
- US
- United States
- Prior art keywords
- image
- kernel
- hair
- filtering
- selected image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/40—Analysis of texture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06M—COUNTING MECHANISMS; COUNTING OF OBJECTS NOT OTHERWISE PROVIDED FOR
- G06M1/00—Design features of general application
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B2017/00743—Type of operation; Specification of treatment sites
- A61B2017/00747—Dermatology
- A61B2017/00752—Hair removal or transplantation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F2/00—Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
- A61F2/02—Prostheses implantable into the body
- A61F2/10—Hair or skin implants
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20036—Morphological image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30088—Skin; Dermal
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30242—Counting objects in image
Definitions
- In another aspect, the step of acquiring the digital image of the body surface may include tracking the FU of interest and aligning the camera(s) to obtain the digital image.
- First and second cameras may be used to provide stereo images.
- The stereo images may be used to track an FU of interest within the digital images of the first and second cameras to adjust for movement of the body surface and/or movement of the cameras.
- The first and second cameras are aligned with the general orientation of the hair of the FU.
- The stereo images may also be used to compute coordinate positions of the hairs. Objects having a computed coordinate position which is inconsistent with a hair on the body surface can then be filtered out.
- The system for counting follicular units may comprise any of the transplantation systems described in the background above.
- For example, the system described in U.S. patent application Ser. No. 11/380,907 may be programmed and configured to perform the methods of counting follicular units according to the present invention.
- The cameras on the system can provide stereo digital images and the robotic arm can properly position and orient the cameras.
- The selection of a region of interest may be performed by an operator at the user interface of the system (such as a computer having a monitor and input devices), or it could be automated through programming of the computer and/or controller.
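The stereo coordinate computation mentioned above can be sketched with a simple rectified-stereo disparity model. Nothing below comes from the patent: the pinhole geometry, the focal length, baseline, and depth window are all illustrative assumptions.

```python
# Sketch: triangulate a point seen in a pair of rectified stereo images and
# reject matches whose computed depth is inconsistent with the body surface.
# Assumed setup (not from the patent): rectified cameras, focal length f in
# pixels, baseline in mm; depth Z = f * baseline / disparity.

def triangulate(x_left, x_right, y, f=1000.0, baseline=50.0):
    """Return (X, Y, Z) in mm for a correspondence in rectified stereo images."""
    disparity = x_left - x_right
    if disparity <= 0:
        return None  # invalid match: point would lie at or behind the cameras
    z = f * baseline / disparity
    return (x_left * z / f, y * z / f, z)

def consistent_with_surface(point, z_min=100.0, z_max=400.0):
    """Keep only points whose depth falls inside an expected scalp range."""
    return point is not None and z_min <= point[2] <= z_max

# Example: a correspondence with 200 px disparity lies 250 mm from the cameras.
p = triangulate(x_left=620.0, x_right=420.0, y=300.0)
```

An object segmented from the image whose triangulated position fails this depth check could then be dropped, as the text above suggests.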
- FIG. 1 is a print of a digital image of an exemplary section of a human scalp having a plurality of follicular units.
- FIG. 2 is a print of the digital image of FIG. 1 after it has been filtered using a band-pass filter.
- FIG. 3 is a print of the digital image of FIG. 2 after the image has been segmented.
- FIG. 4 is a print of the digital image of FIG. 3 after a morphological open operation has been performed on the segmented image.
- FIG. 5 is a print of the digital image of FIG. 4 after noise filtering has been performed on the image.
- The system and method for counting follicular units generally begins with acquiring a digital image 10 of a body surface 11 using one or more digital cameras.
- The body surface 11 has skin 12 and a plurality of follicular units 14, each having one or more hairs 13 (only a few of the follicular units 14 and hairs 13 are labeled in the figures).
- The photo of FIG. 1 is an image of a section of human scalp 11, but it is understood that the body surface 11 could be any area of any body having hair.
- The digital image 10 shows a variety of types of follicular units 14 (FU) on the scalp 11.
- The digital image 10 may be acquired using one or more digital cameras of an automated hair transplantation system, such as the cameras described in the hair transplantation system of U.S. patent application Ser. No. 11/380,907, which is incorporated by reference herein in its entirety.
- The image from just one of the cameras can be used to produce the digital image 10.
- Alternatively, the digital image 10 may be acquired by a more involved process which aligns the camera(s) to improve the image used to classify a follicular unit of interest.
- In this process, a first camera and a second camera are used.
- The cameras are arranged and configured to obtain stereo images of a body surface at which the cameras are directed.
- The cameras are first positioned to be directed at the body surface in an area known to have hair.
- A first digital image is acquired from the first camera and a follicular unit (FU) of interest is selected from within the first digital image.
- A second digital image of about the same region of the body surface is acquired from the second camera, and the same FU of interest is selected from within the second digital image.
- The FU of interest can be selected in the digital images by an operator of the system or automatically by the system using a selection algorithm.
- The transplantation system is now able to track the FU of interest within the first and second digital images from the first and second cameras.
- The tracking procedure can be used to adjust for movement of the body surface and movement of the cameras when they are aligned to acquire the digital image(s) used for classifying the FU.
- Next, the first and second cameras are moved and oriented to be aligned with the general orientation of the hair of the FU.
- Additional digital images may be acquired and processed by the system in order to track the FU of interest.
- In this way, a better digital image for classifying the FU can be acquired.
- The cameras then acquire the digital images to be used in the next steps of the method.
- Next, a region of interest 19, which could be the entire digital image 10 or a sub-area, is selected.
- The selected region of interest 19 may be co-extensive with the digital image 10.
- Alternatively, the selected region of interest 19 can be any subset area of the digital image 10.
- The region of interest 19 may be selected by an operator, or the selection may be automated by the system. This region of interest within the digital image is called the selected image.
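As a concrete illustration of selecting the region of interest, the crop can be a simple array slice. The NumPy representation of the image and the particular ROI bounds here are assumptions for illustration only.

```python
import numpy as np

# Sketch: select a region of interest (ROI) from a grayscale image.
# The ROI may be co-extensive with the whole image or any rectangular sub-area.
image = np.arange(100, dtype=np.uint8).reshape(10, 10)  # stand-in for a camera frame

def select_roi(img, top=0, left=0, height=None, width=None):
    """Return a view of img covering the requested rectangle (whole image by default)."""
    h = img.shape[0] if height is None else height
    w = img.shape[1] if width is None else width
    return img[top:top + h, left:left + w]

whole = select_roi(image)            # co-extensive with the digital image
sub = select_roi(image, 2, 3, 4, 5)  # a sub-area chosen by an operator or algorithm
```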
- FIG. 2 shows a print of the digital image after the original selected image has been filtered using a band-pass filter.
- The band-pass filter may comprise any suitable filter as known by those of ordinary skill in the art.
- The selected image may be filtered using a band-pass filter to remove components corresponding to the skin.
- The band-pass filtering can be accomplished by low-pass filtering the selected image twice and then subtracting the two resulting filtered images.
- That is, the band-pass filter may comprise a first filtering step using a low-pass filter having a first kernel and a second filtering step using a low-pass filter having a second kernel.
- The first kernel is preferably different from the second kernel.
- The kernels of the low-pass filter(s) may be Gaussian kernels.
- The first Gaussian kernel may have substantially the following characteristics: support of 21 pixels, sigma of 1.0.
- The second Gaussian kernel may have substantially the following characteristics: support of 21 pixels, sigma of 0.075.
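The two-kernel band-pass described above can be sketched as a difference of Gaussians: low-pass the selected image once with each kernel and subtract the results. Only the 21-pixel support and the sigmas of 1.0 and 0.075 come from the text; the separable convolution, edge padding, and kernel normalization below are implementation assumptions.

```python
import numpy as np

def gaussian_kernel(support, sigma):
    """Normalized 1-D Gaussian kernel with the given tap count and sigma."""
    x = np.arange(support) - (support - 1) / 2.0
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def lowpass(img, kernel):
    """Separable 2-D low-pass: convolve rows, then columns (edges replicated)."""
    pad = len(kernel) // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    rows = np.apply_along_axis(np.convolve, 1, padded, kernel, mode="same")
    out = np.apply_along_axis(np.convolve, 0, rows, kernel, mode="same")
    return out[pad:-pad, pad:-pad]  # trim back to the original size

def bandpass(img, support=21, sigma_wide=1.0, sigma_narrow=0.075):
    """Difference of the two Gaussian low-pass results, per the two-kernel scheme.
    The very small second sigma makes its kernel nearly an identity, so the
    subtraction mainly removes the smooth skin background."""
    wide = lowpass(img, gaussian_kernel(support, sigma_wide))
    narrow = lowpass(img, gaussian_kernel(support, sigma_narrow))
    return narrow - wide
```

A uniform (skin-only) patch band-passes to approximately zero, which is the intended effect of removing the skin component.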
- FIG. 3 is a print of the binary image after the image has been segmented.
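The patent does not specify how the filtered image is segmented into the binary image of FIG. 3. A global threshold chosen by Otsu's method is one common way to do it, sketched here purely as an assumption.

```python
import numpy as np

def otsu_threshold(img):
    """Pick the global threshold that maximizes between-class variance (Otsu)."""
    hist, bin_edges = np.histogram(img, bins=256)
    hist = hist.astype(float)
    centers = (bin_edges[:-1] + bin_edges[1:]) / 2.0
    w0 = np.cumsum(hist)                 # pixels at or below each threshold
    w1 = w0[-1] - w0                     # pixels above each threshold
    m0 = np.cumsum(hist * centers)
    total_mean = m0[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        mu0 = m0 / w0
        mu1 = (total_mean - m0) / w1
        between = w0 * w1 * (mu0 - mu1) ** 2
    return centers[np.nanargmax(between)]

def segment(filtered):
    """Binary image: True where a pixel likely belongs to a hair rather than skin."""
    return filtered > otsu_threshold(filtered)
```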
- FIG. 4 shows the resulting image after the morphological open operation.
- A morphological open operation is a known, standard image processing technique.
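For readers unfamiliar with the operation, a morphological open is an erosion followed by a dilation with the same structuring element, which removes specks smaller than that element. A minimal NumPy sketch with an assumed 3x3 structuring element (the patent does not specify one):

```python
import numpy as np

def _neighborhood_views(binary):
    """The nine 3x3-shifted views of a zero-padded binary image."""
    padded = np.pad(binary, 1, mode="constant", constant_values=False)
    h, w = binary.shape
    return [padded[i:i + h, j:j + w] for i in range(3) for j in range(3)]

def _erode(binary):
    """3x3 erosion: a pixel survives only if its whole neighborhood is set."""
    return np.logical_and.reduce(_neighborhood_views(binary))

def _dilate(binary):
    """3x3 dilation: a pixel is set if any pixel in its neighborhood is set."""
    return np.logical_or.reduce(_neighborhood_views(binary))

def morphological_open(binary):
    """Erosion then dilation: removes objects smaller than the 3x3 element."""
    return _dilate(_erode(binary))
```

On a binary image containing a solid 3x3 block and a lone stray pixel, the opening preserves the block and deletes the stray pixel, which is exactly the cleanup wanted before noise filtering.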
- After the morphological open operation, the image may still contain many objects which do not correspond to the hair 13 of a follicular unit 14.
- Noise filtering is then performed on the image resulting from the morphological open operation.
- The noise filtering removes objects which do not meet criteria corresponding to a follicular unit 14.
- For example, an object in the image may have an area, location or orientation which does not correspond to hair.
- The object 22 appears to be much longer and to have a much larger area than the other objects in the image 19.
- Thus, this object is probably not a hair 13 and therefore should be filtered out of the image.
- In the print of the image after the noise filtering step (FIG. 5), it can be seen that the object 22 has been filtered out of the image.
- The noise filtering step can filter based on a wide range of characteristics of the objects in the image, including without limitation, length, area, orientation and/or location. Whether the characteristics of an image of an object correspond to hair may be determined by statistical comparison to the global nature of the same characteristics for images of objects in the selected image which are known to be hair, or alternatively, the characteristics can be compared to predetermined criteria based on patient sampling or other data. For instance, the noise filtering can be based on characteristics of a sampling of the other hairs on the body surface of the particular patient, or the characteristics of a sampling of hairs on a sample of patients, or on known predetermined data based on studies or research.
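The statistical comparison described above can be sketched as an outlier test against the global statistics of the objects in the selected image. The object measurements, the property tested, and the two-standard-deviation cutoff are all illustrative assumptions, not values from the patent.

```python
import statistics

def noise_filter(objects, key, n_sigma=2.0):
    """Keep objects whose `key` property lies within n_sigma standard deviations
    of the mean, comparing each object against the global statistics of all
    measured objects in the selected image."""
    values = [obj[key] for obj in objects]
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return list(objects)  # all objects identical; nothing to reject
    return [obj for obj in objects if abs(obj[key] - mean) <= n_sigma * stdev]

# Hypothetical measurements: an object-22-like outlier with a far larger area.
measured = [{"area": 40}, {"area": 44}, {"area": 38}, {"area": 42},
            {"area": 41}, {"area": 39}, {"area": 43}, {"area": 40},
            {"area": 400}]
hairs = noise_filter(measured, "area")
```

The same test could be run per characteristic (length, orientation, location), or the mean and deviation could instead come from predetermined patient-sampling data, as the text allows.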
Abstract
A system and method for counting follicular units using an automated system comprises (i) acquiring a digital image of a body surface having skin and a plurality of follicular units; (ii) selecting a region of interest within the digital image; (iii) filtering the selected image using a band-pass filter; (iv) segmenting the filtered image to produce a binary image; (v) performing a morphological open operation on the binary image; (vi) performing noise filtering by removing objects having certain characteristics which do not correspond to hair; and (vii) counting each remaining object as a follicular unit.
Description
- This invention relates generally to hair transplantation procedures and more particularly to a system and method for counting follicular units using digital imaging and processing techniques for use in hair transplantation procedures.
- Hair transplantation procedures are well-known, and typically involve (in a patient having male pattern baldness) harvesting donor hair grafts from the side and back fringe areas (donor areas) of the patient's scalp, and implanting them in a bald area (recipient area). Historically, the harvested grafts were relatively large (3-5 mm), although more recently, the donor grafts may be single follicular units. In particular, “follicular units” (also referred to herein as FU or FUs) are naturally occurring aggregates of 1-3 (and much less commonly, 4-5) closely spaced hair follicles that are distributed randomly over the surface of the scalp.
- The follicular units may be classified, or “typed,” based on the number of hairs in the unit and identified in shorthand as an “F1” for a single hair follicular unit, an “F2” for a two hair follicular unit and so on for follicular units with 3-5 hairs. In some cases of multiple hair follicular units, the hairs may appear to emanate from a single follicle or point in the skin. In other cases, the hairs may exit the skin surface at slightly spaced apart positions, but converge into a single follicular unit beneath the skin. Referring to FIG. 1, a print of a digital image of an exemplary section of a human scalp 11 having a variety of types of follicular units is shown. For example, the follicular unit 13 has two hairs and is therefore an F2, while follicular unit 15 is an F1 since it has only a single hair. Similarly, follicular unit 17 appears to be an F3 having three hairs.
- There are several reasons it is important and desirable to count and classify follicular units in a region of interest on a body surface. For one, the number of follicular units can be used in the planning process for a transplantation procedure. For instance, this number sets the limit on the number of follicular units in that area that can be harvested for transplantation. However, in many cases, the doctor may want to implant only a certain percentage of the follicular units available, thereby leaving some coverage in the area being harvested. In addition, in many hair restoration transplant procedures, certain classes of follicular units are preferred.
- As for classification, there are several reasons it is important and desirable to identify and classify follicular units based on the number of hairs in the follicular unit. It may be desirable to utilize a variety of classes (also referred to as “types”) of follicular units to provide the desired attributes for the appearance of the transplanted hair. Such attributes can include the density of hair, the direction or orientation of hair, the particular mix of types of follicular units, and/or the appearance of randomness, among other possible attributes. An example of the use of various types of follicular units is as follows. It is preferable to transplant certain classes of follicular units into specific regions of the scalp. For example, single hair follicular units (F1s) are commonly implanted along the hairline that frames the face. Follicular units with more than one hair (F2s, F3s, etc.) are commonly implanted in the mid-scalp and crown. This arrangement of follicular unit distribution is thought to produce a more natural appearing aesthetic result.
- Various procedures for hair transplantation have been previously disclosed, including both manual procedures and procedures mechanized to certain degrees of automation. In one well-known manual process, a linear portion of the scalp is removed from a donor area by dissection with a scalpel down into the fatty subcutaneous tissue. The strip is dissected (under a microscope) into the component follicular units, which are then implanted into a recipient area in respective puncture holes made by a needle. Forceps are typically used to grasp and place the follicular unit grafts into the needle puncture locations, although other instruments and methods are known for doing so.
- In “Androgenetic Alopecia” (Springer 1996), M. Inaba & Y. Inaba disclose and describe a manual method for harvesting singular follicular units by positioning a hollow punch needle having a cutting edge and interior lumen with a diameter of 1 mm, which is about equal to the diameter of critical anatomical parts of a follicular unit. The needle punch is axially aligned with an axis of a follicular unit to be extracted and then advanced into the scalp to cut the scalp about the circumference of the selected follicular unit. Thereafter, the follicular units are easily removed, e.g., using forceps, for subsequent implantation into a recipient site with a specially devised insertion needle.
- U.S. Pat. No. 6,585,746 discloses an automated hair transplantation system utilizing a robot, including a robotic arm and a hair follicle introducer associated with the robotic arm. A video system is used to produce a three-dimensional virtual image of the patient's scalp, which is used to plan the scalp locations that are to receive hair grafts implanted by the follicle introducer under the control of the robotic arm. The entire disclosure of U.S. Pat. No. 6,585,746 is incorporated herein by reference.
- Automated systems and methods for transplanting are also disclosed in U.S. provisional patent application Ser. Nos. 60/722,521, filed Sep. 30, 2005, 60/753,602, filed Dec. 22, 2005, and 60/764,173, filed Jan. 31, 2006, and U.S. patent application Ser. Nos. 11/380,903, filed Apr. 28, 2006 and 11/380,907, filed Apr. 28, 2006. The foregoing applications are all hereby incorporated by reference into the present application in their entirety.
- For example, in U.S. patent application Ser. No. 11/380,907, referenced above, the disclosed system comprises a robotic arm having a harvesting and/or implantation tool mounted on the arm. One or more cameras are also mounted on the arm and are used to image the work space, such as a body surface. A processor is configured to receive and process images acquired by the cameras. A controller is operatively coupled to the processor and the robotic arm. The controller controls the movement of the robotic arm based, at least in part, on the processed images acquired by the cameras and the processor. The arm is controllably moveable to position the tool at a desired orientation and position relative to the body surface to perform transplantation of hairs.
- In utilizing any of these systems and methods for hair transplantation, it is desirable to first plan the transplantation to select the follicular units to be harvested and transplanted and to determine the precise location where the hairs are to be implanted. Accordingly, in planning a hair transplantation procedure, specific follicular units from a specific location on a body surface may be selected for harvesting and transplantation into a different part of the body surface. The follicular units to be transplanted may be selected based on certain criteria, for example, the type of follicular unit (i.e. F1, F2, etc.), the orientation of the hair in the follicular unit, the density of the hair, etc. However, the process of counting and characterizing each follicular unit can be tedious and time-consuming. Therefore, there is a need for a system and method for counting and/or classifying follicular units using an automated system. A system and method for classifying follicular units is described in U.S. patent application Serial No. (not yet assigned), filed on or about Aug. 25, 2006, entitled SYSTEM AND METHOD FOR CLASSIFYING FOLLICULAR UNITS, the contents of which is incorporated by reference herein in its entirety.
- In accordance with a general aspect of the inventions disclosed herein, a system and method for counting follicular units using an automated system is provided. The system and method of the present invention may be utilized with systems and methods for transplantation of hair follicular units on a body surface, such as a human scalp. The system and method of the present invention is especially useful when implemented on, or integrated with, an automated system for hair transplantation.
- In one aspect of the present invention, the method of counting follicular units comprises first acquiring a digital image of a body surface having skin and follicular units. A region of interest within the digital image which is known to contain one or more follicular units (FU) is selected. This region of interest within the digital image is called the selected image. The selected image is digitally filtered using a band-pass filter to remove components in the image which correspond to the skin. The filtered image is then segmented to produce a binary image of the region of interest.
- A morphological open operation is performed on the binary image to further process the image in preparation for the subsequent noise filtering step. A morphological open operation is a standard image processing technique known by those of ordinary skill in the art. Noise filtering is then performed on the image resulting from the morphological open operation. The noise filtering removes objects which do not meet criteria corresponding to a follicular unit. For example, an object in the image may have an area, location or orientation that does not correspond to an actual follicular unit (it could, for instance, be cut hair remaining on the scalp). Whether the characteristics of an image of an object correspond to hair may be determined by statistical comparison to the same characteristics, taken globally, of objects in the selected image which are known to be hair; alternatively, the characteristics can be compared to predetermined criteria based on patient sampling or other data (e.g., if the patient parts the hair in a certain way, it is known that the hairs should mostly point in a given direction).
- Each of the objects remaining in the image after the noise filtering is counted as a follicular unit. Thus, the method may be used to count follicular units.
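The pipeline just described can be sketched in a few lines of Python with NumPy and SciPy. This is an illustrative sketch only: it assumes a grayscale image in which hairs are darker than the surrounding skin, and the filter sigmas, the threshold rule, the 3x3 structuring element and the two-standard-deviation area test are hypothetical choices, not values prescribed by the disclosure.

```python
import numpy as np
from scipy import ndimage

def count_follicular_units(gray, sigma_small=1.0, sigma_large=8.0):
    """Sketch of the counting pipeline; every parameter value here is
    an illustrative assumption, not the patent's setting."""
    img = gray.astype(float)

    # 1. Band-pass filter: subtract two Gaussian low-pass results to
    #    suppress the slowly varying skin background.
    band = ndimage.gaussian_filter(img, sigma_small) - ndimage.gaussian_filter(img, sigma_large)

    # 2. Segment to a binary image (hairs assumed darker than skin, so
    #    keep strongly negative band-pass responses).
    binary = band < band.mean() - band.std()

    # 3. Morphological open (erosion then dilation) to remove artifacts.
    binary = ndimage.binary_opening(binary, structure=np.ones((3, 3), dtype=bool))

    # 4. Noise-filter the labeled objects: drop any whose area is more
    #    than two standard deviations from the mean object area.
    labels, n = ndimage.label(binary)
    if n == 0:
        return 0
    areas = ndimage.sum(binary, labels, index=np.arange(1, n + 1))
    if n > 1 and areas.std() > 0:
        keep = np.abs(areas - areas.mean()) <= 2.0 * areas.std()
    else:
        keep = np.ones(n, dtype=bool)

    # 5. Each remaining object is counted as one follicular unit.
    return int(keep.sum())
```

Applied to a synthetic "scalp" of bright skin with a few dark strokes, the function returns the number of strokes.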
- In another aspect of the method of counting follicular units, the step of digitally filtering the selected image using a band-pass filter may comprise a first filtering step using a low-pass filter having a first kernel and a second filtering step using a low-pass filter having a second kernel. In another feature of the present invention, the low-pass kernels may be Gaussian kernels. Those of ordinary skill in the art are familiar with, and understand how to implement, such low-pass filters and Gaussian filters.
- In another embodiment of the method of the present invention, the step of acquiring the digital image of the body surface includes a method for tracking the FU of interest and aligning the camera(s) to obtain the digital image. First and second cameras provide stereo images. The stereo images may be used to track an FU of interest within the digital images of the first and second cameras to adjust for movement of the body surface and/or movement of the cameras. In addition, the first and second cameras are aligned with the general orientation of the hair of the FU. The stereo images may also be used to compute coordinate positions of the hairs. Objects having a computed coordinate position which is inconsistent with a hair on the body surface can then be filtered out.
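For a calibrated, rectified stereo pair, the coordinate computation mentioned above reduces to standard disparity-based triangulation. The helper below is an illustrative assumption; the disclosure does not specify its stereo geometry or calibration model.

```python
def triangulate_point(x_left, x_right, y, f, baseline, cx=0.0, cy=0.0):
    """Recover a 3-D point from matched pixel coordinates in a
    rectified stereo pair (standard pinhole geometry; illustrative,
    not the patent's own math).

    f        -- focal length in pixels
    baseline -- separation between the two cameras
    cx, cy   -- principal point offsets
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("matched point must have positive disparity")
    z = f * baseline / disparity     # depth is inversely proportional to disparity
    x = (x_left - cx) * z / f        # lateral position
    y3 = (y - cy) * z / f            # vertical position
    return (x, y3, z)
```

A hair whose triangulated depth places it well off the body surface can then be rejected in the noise-filtering step.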
- The system for counting follicular units using an automated system may comprise any of the transplantation systems described in the background above. For instance, the system described in U.S. patent application Ser. No. 11/380,907 may be programmed and configured to perform the methods of counting follicular units according to the present invention. The cameras on the system can provide stereo digital images and the robotic arm can properly position and orient the cameras. The selection of a region of interest may be performed by an operator at the user interface of the system (such as a computer having a monitor and input devices) or it could be automated through programming of the computer and/or controller.
- Accordingly, a system and method for counting follicular units is provided. Other and further embodiments, objects and advantages of the invention will become apparent from the following detailed description when read in view of the accompanying figures.
- The invention is illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements, and in which:
-
FIG. 1 is a print of a digital image of an exemplary section of a human scalp having a plurality of follicular units. -
FIG. 2 is a print of the digital image of FIG. 1 after it has been filtered using a band-pass filter. -
FIG. 3 is a print of the digital image of FIG. 2 after the image has been segmented. -
FIG. 4 is a print of the digital image of FIG. 3 after a morphological open operation has been performed on the segmented image. -
FIG. 5 is a print of the digital image of FIG. 4 after noise filtering has been performed on the image. - Referring first to
FIG. 1, the system and method for counting follicular units according to the present invention generally begins with acquiring a digital image 10 of a body surface 11 using one or more digital cameras. The body surface 11 has skin 12 and a plurality of follicular units 14 each having one or more hairs 13 (only a few of the follicular units 14 and hairs 13 are labeled in the figures). The photo of FIG. 1 is an image of a section of human scalp 11, but it is understood that the body surface 11 could be any area of any body having hair. The digital image 10 shows a variety of types of follicular units 14 (FU) on the scalp 11. - The digital image 10 may be acquired using one or more digital cameras of an automated hair transplantation system, such as the cameras described in the hair transplantation system of U.S. patent application Ser. No. 11/380,907, which is incorporated by reference herein in its entirety. The image from just one of the cameras can be used to produce the digital image 10. Alternatively, the digital image 10 may be acquired by a more involved process which aligns the camera(s) to improve the image used to classify a follicular unit of interest. In this process, a first camera and a second camera are used. The cameras are arranged and configured to obtain stereo images of a body surface at which the cameras are directed. The cameras are first positioned to be directed at the body surface in an area known to have hair. A first digital image is acquired from the first camera and a follicular unit (FU) of interest is selected from within the first digital image. A second digital image of about the same region of the body surface as the first camera (except from a slightly different angle as provided by stereo cameras) is acquired from the second camera and the same FU of interest is selected from within the second digital image. 
The FU of interest can be selected in the digital images by an operator of the system or automatically by the system using a selection algorithm. The transplantation system is now able to track the FU of interest within the first and second digital images from the first and second cameras. The tracking procedure can be used to adjust for movement of the body surface and movement of the cameras when they are aligned to acquire the digital image(s) used for classifying the FU. Next, the first and second cameras are moved and oriented to be aligned with the general orientation of the hair of the FU. As the cameras are moved, additional digital images may be acquired and processed by the system in order to track the FU of interest. By aligning the cameras with the hair of the FU, a better digital image for classifying the FU can be acquired. With the cameras in the desired alignment, the cameras acquire the digital images to be used in the next steps of the method of classifying a follicular unit.
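The disclosure does not specify a tracking algorithm. One common way to track a small region such as the FU of interest between frames is normalized cross-correlation template matching, sketched here as a brute-force illustration (the function name and search strategy are assumptions for the example):

```python
import numpy as np

def track_template(frame, template):
    """Locate `template` (e.g., a patch around the FU of interest from
    a previous frame) in `frame` by normalized cross-correlation.
    Returns the (row, col) of the best-matching window's top-left
    corner. Brute-force search, for illustration only."""
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.sqrt((t ** 2).sum())
    best, best_pos = -np.inf, (0, 0)
    for r in range(frame.shape[0] - th + 1):
        for c in range(frame.shape[1] - tw + 1):
            w = frame[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = tnorm * np.sqrt((wz ** 2).sum())
            if denom == 0:
                continue                      # skip flat windows
            score = (wz * t).sum() / denom    # NCC in [-1, 1]
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos
```

Running this on each new frame keeps the FU of interest localized while the cameras or body surface move between acquisitions.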
- After the digital image 10 is acquired, a region of
interest 19, which could be the entire digital image 10 or a sub-area, is selected. In the example described herein, the selected region of interest 19 is co-extensive with the digital image 10. However, the selected region of interest 19 can be any subset area of the digital image 10. The region of interest 19 may be selected by an operator or the selection may be automated by the system. This region of interest within the digital image is called the selected image. - Then, the selected
image 19 is digitally filtered using a band-pass filter to remove components in the image which correspond to the skin 12. FIG. 2 shows a print of the digital image after the original selected image has been filtered using a band-pass filter. The band-pass filter may comprise any suitable filter as known by those of ordinary skill in the art. For example, the band-pass filtering can be accomplished by low-pass filtering the selected image twice and then subtracting the two resulting filtered images. That is, the band-pass filter may comprise a first filtering step using a low-pass filter having a first kernel and a second filtering step using a low-pass filter having a second kernel. The first kernel is preferably different from the second kernel. In one embodiment of the present invention, the kernels of the low-pass filter(s) may be Gaussian kernels. The first Gaussian kernel may have substantially the following characteristics: support 21 pixels, sigma of 1.0. The second Gaussian kernel may have substantially the following characteristics: support 21 pixels, sigma of 0.075. - Next, the resulting image after the band-pass filtering is segmented using well-known digital image processing techniques to produce a binary image of the selected image.
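Using the kernel parameters quoted above (support of 21 pixels; sigmas of 1.0 and 0.075), the two low-pass passes and the subtraction can be sketched as follows. The separable-convolution wrapper is an illustrative implementation, not the disclosure's own code:

```python
import numpy as np

def gaussian_kernel(support, sigma):
    """Normalized 1-D Gaussian kernel with `support` taps."""
    x = np.arange(support) - (support - 1) / 2.0
    k = np.exp(-(x ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()

def lowpass(img, support, sigma):
    """Separable Gaussian low-pass: filter rows, then columns."""
    k = gaussian_kernel(support, sigma)
    rows = np.apply_along_axis(np.convolve, 1, img, k, mode="same")
    return np.apply_along_axis(np.convolve, 0, rows, k, mode="same")

def bandpass(img, first=(21, 1.0), second=(21, 0.075)):
    """Band-pass as the difference of two low-pass results, with the
    (support, sigma) pairs quoted in the text as defaults."""
    img = img.astype(float)
    return lowpass(img, *first) - lowpass(img, *second)
```

On a uniform region the two low-pass results agree and the band-pass response is essentially zero, while a thin dark stroke (a hair against skin) produces a strong response.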
FIG. 3 is a print of the binary image after the image has been segmented. - Then, a morphological open operation is performed on the binary image to remove artifacts from the image.
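A morphological open is an erosion followed by a dilation with the same structuring element; it removes features smaller than the element while largely preserving the shape of larger objects. A minimal sketch, in which the 3x3 structuring element is an assumed choice:

```python
import numpy as np
from scipy import ndimage

# A binary image with one 5x5 block (standing in for a hair segment)
# and two isolated single-pixel artifacts.
binary = np.zeros((20, 20), dtype=bool)
binary[5:10, 5:10] = True                 # real object
binary[2, 15] = binary[15, 2] = True      # noise pixels

# Opening removes anything smaller than the structuring element:
# the lone pixels vanish, the 5x5 block survives unchanged.
opened = ndimage.binary_opening(binary, structure=np.ones((3, 3), dtype=bool))
```

The single-pixel artifacts are erased while the block's 25 pixels are preserved exactly.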
FIG. 4 shows the resulting image after the morphological open operation. As stated above, a morphological open operation is a known, standard image processing technique. As can be seen in FIG. 4, the image may still contain many objects which do not correspond to the hair 13 of a follicular unit 14. There are many objects which appear to be too long, too large, randomly oriented and/or in a location which probably does not contain hair. - Accordingly, noise filtering is then performed on the image resulting from the morphological open operation. The noise filtering removes objects which do not meet criteria corresponding to a follicular unit 14. For example, an object may have an area, location or orientation which does not correspond to hair. Referring back to
FIG. 4, the object 22 appears to be much longer and have a much larger area than the other objects in the image 19. Thus, it can be assumed that this object is probably not a hair 13 and therefore should be filtered out of the image. Turning now to the print of the image after the noise filtering step of FIG. 5, it can be seen that the object 22 has been filtered out of the image. The noise filtering step can filter based on a wide range of characteristics of the objects in the image, including without limitation, length, area, orientation and/or location. Whether the characteristics of an image of an object correspond to hair may be determined by statistical comparison to the same characteristics, taken globally, of objects in the selected image which are known to be hair; alternatively, the characteristics can be compared to predetermined criteria based on patient sampling or other data. For instance, the noise filter can be based on characteristics of a sampling of the other hairs on the body surface of the particular patient, or the characteristics of a sampling of hairs on a sample of patients, or on known predetermined data based on studies or research. - Any or all of the systems and methods for classifying a follicular unit as described herein may be used in conjunction with the system and method of harvesting and transplanting hair as described in U.S. patent application Ser. No. 11/380,903 and U.S. patent application Ser. No. 11/380,907.
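The two-standard-deviation area test recited in claim 9 is one concrete instance of the statistical comparison described above. It might be sketched as follows (illustrative only; a full implementation would apply the same pattern to length, orientation and location as well):

```python
import numpy as np
from scipy import ndimage

def filter_by_area(binary, z_max=2.0):
    """Remove labeled objects whose area differs from the mean object
    area of the selected image by more than z_max standard deviations.
    z_max=2.0 mirrors the two-standard-deviation criterion of claim 9."""
    labels, n = ndimage.label(binary)
    if n == 0:
        return binary
    areas = ndimage.sum(binary, labels, index=np.arange(1, n + 1))
    std = areas.std()
    if std == 0:
        return binary                     # all objects identical in area
    keep = np.abs(areas - areas.mean()) <= z_max * std
    # Map the per-label keep decision back onto the pixel grid
    # (label 0 is background and is always False).
    return np.concatenate(([False], keep))[labels]
```

An object like the oversized object 22 of FIG. 4 lands far outside the two-sigma band and is removed, while the ordinary hair-sized objects survive.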
- The foregoing illustrated and described embodiments of the invention are susceptible to various modifications and alternative forms, and it should be understood that the invention generally, as well as the specific embodiments described herein, are not limited to the particular forms or methods disclosed, but to the contrary cover all modifications, equivalents and alternatives falling within the scope of the appended claims. By way of non-limiting example, it will be appreciated by those skilled in the art that the invention is not limited to the use of a robotic system including a robotic arm, and that other automated and semi-automated systems may be utilized. Moreover, the system and method of counting follicular units of the present invention can be a separate system used along with a separate automated transplantation system or even with a manual transplantation procedure.
Claims (15)
1. A method of identifying follicular units on a body surface having skin and hair, comprising:
acquiring a digital image of a body surface;
selecting a region of interest within said digital image called a selected image;
digitally filtering said selected image, via a band-pass filter, to remove components corresponding to the skin;
segmenting said selected image to produce a binary image;
performing a morphological open operation on said binary image; and
performing noise filtering by removing objects having certain characteristics which do not correspond to such characteristics of hair.
2. The method of claim 1 , wherein said step of filtering said selected image via a band-pass filter comprises low-pass filtering the selected image twice and then subtracting the two resulting filtered images.
3. The method of claim 1 , wherein said step of filtering said selected image via a band-pass filter comprises a first filter step in which said selected image is filtered using a low-pass filter having a first kernel and a second filter step in which said selected image is filtered using a low-pass filter having a second kernel.
4. The method of claim 3 , wherein said first kernel is a Gaussian kernel having substantially the following characteristics, support 21 pixels, sigma of 1.0.
5. The method of claim 3 , wherein said second kernel is a Gaussian kernel having substantially the following characteristics, support 21 pixels, sigma of 0.75.
6. The method of claim 3 wherein said first kernel is a Gaussian kernel having substantially the following characteristics, support 21 pixels, sigma of 1.0 and said second kernel is a Gaussian kernel having substantially the following characteristics, support 21 pixels, sigma of 0.75.
7. The method of claim 1 , further comprising the following steps:
acquiring a second digital image in stereo correspondence to said first digital image;
computing the coordinate position of a hair using said first and second digital images; and
filtering out images having a computed coordinate position which is inconsistent with a hair on said body surface.
8. The method of claim 1 , further comprising counting the discrete images of hairs using said processed image.
9. The method of claim 1 , wherein said step of noise filtering comprises filtering out any object having an area that differs by more than two standard deviations from the mean object size within the selected image.
10. The method of claim 1 , wherein said step of noise filtering comprises filtering out any object having an area more than two standard deviations larger than the mean object size.
11. The method of claim 1 , wherein said characteristics include one or more of area, location and orientation.
12. The method of claim 1 , wherein said step of noise filtering comprises filtering out objects whose characteristics do not correspond to such characteristics for the other objects in the region of interest.
13. The method of claim 1 , wherein said step of noise filtering comprises filtering out objects whose characteristics do not correspond to such characteristics for hair based on a sampling of hairs on the body surface.
14. The method of claim 1 , wherein said step of noise filtering comprises filtering out objects whose characteristics do not correspond to such characteristics expected for hairs based on predetermined data.
15. The method of claim 3 wherein said first kernel is different from said second kernel.
Priority Applications (11)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/467,283 US20080049993A1 (en) | 2006-08-25 | 2006-08-25 | System and method for counting follicular units |
CA2661660A CA2661660C (en) | 2006-08-25 | 2007-08-24 | System and method for counting follicular units |
US12/162,604 US8199983B2 (en) | 2006-08-25 | 2007-08-24 | System and method for counting follicular units |
CN2007800310595A CN101505659B (en) | 2006-08-25 | 2007-08-24 | System and method for counting follicular units |
EP07841322A EP2053973A4 (en) | 2006-08-25 | 2007-08-24 | System and method for counting follicular units |
JP2009525790A JP4988847B2 (en) | 2006-08-25 | 2007-08-24 | System and method for counting hair follicle units |
AU2007286606A AU2007286606B2 (en) | 2006-08-25 | 2007-08-24 | System and method for counting follicular units |
KR1020097003097A KR101030853B1 (en) | 2006-08-25 | 2007-08-24 | Method of counting follicular units, system and image processor therefor, and method of counting and classifying follicular units |
BRPI0715630-8A BRPI0715630A2 (en) | 2006-08-25 | 2007-08-24 | Follicular Unit Counting System and Method |
PCT/US2007/076728 WO2008024955A2 (en) | 2006-08-25 | 2007-08-24 | System and method for counting follicular units |
US13/472,631 US8290229B2 (en) | 2006-08-25 | 2012-05-16 | System and method for counting follicular units |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/467,283 US20080049993A1 (en) | 2006-08-25 | 2006-08-25 | System and method for counting follicular units |
Publications (1)
Publication Number | Publication Date |
---|---|
US20080049993A1 true US20080049993A1 (en) | 2008-02-28 |
Family
ID=39107713
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/467,283 Abandoned US20080049993A1 (en) | 2006-08-25 | 2006-08-25 | System and method for counting follicular units |
US12/162,604 Active 2030-04-01 US8199983B2 (en) | 2006-08-25 | 2007-08-24 | System and method for counting follicular units |
US13/472,631 Active US8290229B2 (en) | 2006-08-25 | 2012-05-16 | System and method for counting follicular units |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/162,604 Active 2030-04-01 US8199983B2 (en) | 2006-08-25 | 2007-08-24 | System and method for counting follicular units |
US13/472,631 Active US8290229B2 (en) | 2006-08-25 | 2012-05-16 | System and method for counting follicular units |
Country Status (9)
Country | Link |
---|---|
US (3) | US20080049993A1 (en) |
EP (1) | EP2053973A4 (en) |
JP (1) | JP4988847B2 (en) |
KR (1) | KR101030853B1 (en) |
CN (1) | CN101505659B (en) |
AU (1) | AU2007286606B2 (en) |
BR (1) | BRPI0715630A2 (en) |
CA (1) | CA2661660C (en) |
WO (1) | WO2008024955A2 (en) |
Families Citing this family (43)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3117768B1 (en) | 2006-05-19 | 2019-11-06 | The Queen's Medical Center | Motion tracking system and method for real time adaptive imaging and spectroscopy |
US7477782B2 (en) * | 2006-08-25 | 2009-01-13 | Restoration Robotics, Inc. | System and method for classifying follicular units |
US8211134B2 (en) | 2007-09-29 | 2012-07-03 | Restoration Robotics, Inc. | Systems and methods for harvesting, storing, and implanting hair grafts |
US8152827B2 (en) | 2008-01-11 | 2012-04-10 | Restoration Robotics, Inc. | Systems and methods for harvesting, storing, and implanting hair grafts |
US8652186B2 (en) | 2008-06-04 | 2014-02-18 | Restoration Robotics, Inc. | System and method for selecting follicular units for harvesting |
US9107697B2 (en) | 2008-06-04 | 2015-08-18 | Restoration Robotics, Inc. | System and method for selecting follicular units for harvesting |
US9314082B2 (en) | 2009-09-17 | 2016-04-19 | Pilofocus, Inc. | System and method for extraction of hair follicle |
IN2012DN02415A (en) | 2009-09-17 | 2015-08-21 | Carlos K Weskley | |
US9693799B2 (en) | 2009-09-17 | 2017-07-04 | Pilofocus, Inc. | System and method for aligning hair follicle |
US9498289B2 (en) | 2010-12-21 | 2016-11-22 | Restoration Robotics, Inc. | Methods and systems for directing movement of a tool in hair transplantation procedures |
US8911453B2 (en) | 2010-12-21 | 2014-12-16 | Restoration Robotics, Inc. | Methods and systems for directing movement of a tool in hair transplantation procedures |
US8951266B2 (en) * | 2011-01-07 | 2015-02-10 | Restoration Robotics, Inc. | Methods and systems for modifying a parameter of an automated procedure |
KR101194773B1 (en) * | 2011-03-29 | 2012-10-26 | 강진수 | Measurement method of alopecia progression degree |
US8945150B2 (en) | 2011-05-18 | 2015-02-03 | Restoration Robotics, Inc. | Systems and methods for selecting a desired quantity of follicular units |
US9606209B2 (en) | 2011-08-26 | 2017-03-28 | Kineticor, Inc. | Methods, systems, and devices for intra-scan motion correction |
CN104114110A (en) | 2011-10-17 | 2014-10-22 | 皮洛福克斯有限公司 | Hair restoration |
US8897522B2 (en) * | 2012-05-30 | 2014-11-25 | Xerox Corporation | Processing a video for vascular pattern detection and cardiac function analysis |
US20140028822A1 (en) * | 2012-07-30 | 2014-01-30 | Alex A. Khadavi | Hair loss monitor |
US9305365B2 (en) | 2013-01-24 | 2016-04-05 | Kineticor, Inc. | Systems, devices, and methods for tracking moving targets |
US10327708B2 (en) | 2013-01-24 | 2019-06-25 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
US9717461B2 (en) | 2013-01-24 | 2017-08-01 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
WO2014120734A1 (en) | 2013-02-01 | 2014-08-07 | Kineticor, Inc. | Motion tracking system for real time adaptive motion compensation in biomedical imaging |
US8983157B2 (en) * | 2013-03-13 | 2015-03-17 | Restoration Robotics, Inc. | System and method for determining the position of a hair tail on a body surface |
CN106572810A (en) | 2014-03-24 | 2017-04-19 | 凯内蒂科尔股份有限公司 | Systems, methods, and devices for removing prospective motion correction from medical imaging scans |
WO2016014718A1 (en) | 2014-07-23 | 2016-01-28 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
US10387021B2 (en) | 2014-07-31 | 2019-08-20 | Restoration Robotics, Inc. | Robotic hair transplantation system with touchscreen interface for controlling movement of tool |
US9943247B2 (en) | 2015-07-28 | 2018-04-17 | The University Of Hawai'i | Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan |
US10013642B2 (en) | 2015-07-30 | 2018-07-03 | Restoration Robotics, Inc. | Systems and methods for hair loss management |
US10568564B2 (en) | 2015-07-30 | 2020-02-25 | Restoration Robotics, Inc. | Systems and methods for hair loss management |
WO2017091479A1 (en) | 2015-11-23 | 2017-06-01 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
WO2017165191A2 (en) * | 2016-03-23 | 2017-09-28 | The Procter & Gamble Company | Imaging method for determining stray fibers |
US20180189976A1 (en) * | 2016-12-29 | 2018-07-05 | Michal Kasprzak | Analysis unit and system for assessment of hair condition |
US9980649B1 (en) | 2017-02-15 | 2018-05-29 | International Business Machines Corporation | Skin scanning device with hair orientation and view angle changes |
KR101800365B1 (en) | 2017-08-09 | 2017-11-22 | 김태희 | Automatic hair follicle separation device |
KR102004724B1 (en) * | 2017-11-30 | 2019-07-29 | (주)인트인 | Method for detecting sperm and record media recorded program for realizing the same |
KR102234006B1 (en) * | 2018-11-27 | 2021-03-31 | 김태희 | Method and collect hair information |
KR102324240B1 (en) * | 2019-01-28 | 2021-11-11 | 주식회사 아프스 | Hair Identifying Device and Apparatus for Automatically Separating Hair Follicles Including the Same |
CN114761994A (en) * | 2019-12-09 | 2022-07-15 | 瑞典爱立信有限公司 | Joint visual object detection and object mapping to 3D models |
WO2021215749A1 (en) * | 2020-04-21 | 2021-10-28 | (주)에임즈 | Hair information provision method and device, and hair root analysis model generation method therefor |
KR102329641B1 (en) * | 2020-04-21 | 2021-11-22 | (주)에임즈 | Method of analyzing hair condition and apparatus and system of providing hair information |
CN112419355A (en) * | 2020-11-25 | 2021-02-26 | 复旦大学 | Hair Counting Method in Hair Image Based on Corner Detection |
US11741600B2 (en) | 2021-04-21 | 2023-08-29 | Santosh Sharad Katekari | Identifying follicular units |
IL312624A (en) * | 2021-11-08 | 2024-07-01 | Spider Medical Ltd | System and methods for performing a remote hair analysis |
Family Cites Families (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4807163A (en) * | 1985-07-30 | 1989-02-21 | Gibbons Robert D | Method and apparatus for digital analysis of multiple component visible fields |
JPH03218735A (en) * | 1989-10-27 | 1991-09-26 | Kanebo Ltd | Hair growth measuring method |
JP2783033B2 (en) * | 1992-01-13 | 1998-08-06 | 日本電気株式会社 | Method and apparatus for extracting area of color image |
US5331472A (en) | 1992-09-14 | 1994-07-19 | Rassman William R | Method and apparatus for measuring hair density |
US5782851A (en) | 1996-04-10 | 1998-07-21 | Rassman; William R. | Hair transplantation system |
US6517004B2 (en) | 1995-12-18 | 2003-02-11 | Metrologic Instruments, Inc. | Automated system for identifying and dimensioning packages transported through a laser scanning tunnel using laser scanning beam indexing techniques |
IT1292287B1 (en) | 1997-04-24 | 1999-01-29 | Roberto Luigi Costanzo | SURGICAL INSTRUMENT FOR AUTOLOGICAL HAIR TRANSPLANTATION. |
US5999639A (en) * | 1997-09-04 | 1999-12-07 | Qualia Computing, Inc. | Method and system for automated detection of clustered microcalcifications from digital mammograms |
US5895403A (en) | 1997-10-17 | 1999-04-20 | Collinsworth; Lonnie Rae | Surgical cutting tool |
US6973931B1 (en) | 1997-10-30 | 2005-12-13 | King Christopher R | Automated hair isolation and processing system |
US6720988B1 (en) | 1998-12-08 | 2004-04-13 | Intuitive Surgical, Inc. | Stereo imaging system and method for use in telerobotic systems |
WO2000064379A1 (en) * | 1999-04-23 | 2000-11-02 | Gildenberg Philip L | Hair transplantation method and apparatus |
FR2802678B1 (en) | 1999-12-21 | 2002-02-08 | Oreal | SYSTEM AND METHOD FOR FORECAST ANALYSIS AND SIMULATION OF THE TEMPORAL EVOLUTION OF A HAIRY ZONE, AND MORE PARTICULARLY OF HUMAN SCALP |
US6585746B2 (en) | 2000-04-20 | 2003-07-01 | Philip L. Gildenberg | Hair transplantation method and apparatus |
US7127081B1 (en) | 2000-10-12 | 2006-10-24 | Momentum Bilgisayar, Yazilim, Danismanlik, Ticaret, A.S. | Method for tracking motion of a face |
JP3966819B2 (en) | 2001-05-24 | 2007-08-29 | キム,スーギョン | Novel second keratinocyte growth factor analogue present in the hair follicle |
US7217266B2 (en) | 2001-05-30 | 2007-05-15 | Anderson R Rox | Apparatus and method for laser treatment with spectroscopic feedback |
US20030095707A1 (en) | 2001-11-19 | 2003-05-22 | Koninklijke Philips Electronics N.V. | Computer vision method and system for blob-based analysis using a probabilistic framework |
US6949115B2 (en) | 2003-10-14 | 2005-09-27 | Southland Instruments, Inc. | Polarized light analyzer |
JP2006051210A (en) * | 2004-08-12 | 2006-02-23 | Japan Research Institute Ltd | Hair density prediction apparatus, hair density prediction method, and program |
WO2006047502A2 (en) | 2004-10-25 | 2006-05-04 | Brigham And Women's Hospital | Automated segmentation, classification and tracking of cell nuclei in time-lapse microscopy |
EP1653219A1 (en) | 2004-10-26 | 2006-05-03 | The Procter & Gamble Company | Method and apparatus for examining human hairs |
KR20050050061A (en) * | 2005-04-30 | 2005-05-27 | 조동욱 | Facial features extraction for ocular inspection |
US20070078466A1 (en) | 2005-09-30 | 2007-04-05 | Restoration Robotics, Inc. | Methods for harvesting follicular units using an automated system |
US7962192B2 (en) | 2005-09-30 | 2011-06-14 | Restoration Robotics, Inc. | Systems and methods for aligning a tool with a desired location or object |
JP2007260038A (en) * | 2006-03-28 | 2007-10-11 | Taisho Pharmaceut Co Ltd | Method for measuring the number of hairs and method for evaluating the effect of hair growth agents |
JP2009537231A (en) | 2006-05-19 | 2009-10-29 | マコ サージカル コーポレーション | Method and apparatus for controlling a haptic device |
US7620144B2 (en) | 2006-06-28 | 2009-11-17 | Accuray Incorporated | Parallel stereovision geometry in image-guided radiosurgery |
US20080033455A1 (en) | 2006-08-03 | 2008-02-07 | Rassman William R | Hair extraction device and method for its use |
US7477782B2 (en) | 2006-08-25 | 2009-01-13 | Restoration Robotics, Inc. | System and method for classifying follicular units |
US8115807B2 (en) | 2007-03-06 | 2012-02-14 | William Rassman | Apparatus and method for mapping hair metric |
-
2006
- 2006-08-25 US US11/467,283 patent/US20080049993A1/en not_active Abandoned
-
2007
- 2007-08-24 US US12/162,604 patent/US8199983B2/en active Active
- 2007-08-24 EP EP07841322A patent/EP2053973A4/en not_active Withdrawn
- 2007-08-24 JP JP2009525790A patent/JP4988847B2/en active Active
- 2007-08-24 BR BRPI0715630-8A patent/BRPI0715630A2/en not_active Application Discontinuation
- 2007-08-24 CA CA2661660A patent/CA2661660C/en active Active
- 2007-08-24 CN CN2007800310595A patent/CN101505659B/en active Active
- 2007-08-24 WO PCT/US2007/076728 patent/WO2008024955A2/en active Application Filing
- 2007-08-24 AU AU2007286606A patent/AU2007286606B2/en not_active Ceased
- 2007-08-24 KR KR1020097003097A patent/KR101030853B1/en active IP Right Grant
-
2012
- 2012-05-16 US US13/472,631 patent/US8290229B2/en active Active
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090306498A1 (en) * | 2008-06-06 | 2009-12-10 | Restoration Robotics, Inc. | Systems and Methods for Improving Follicular Unit Harvesting |
US8545517B2 (en) * | 2008-06-06 | 2013-10-01 | Restoration Robotics, Inc. | Systems and methods for improving follicular unit harvesting |
US20100080415A1 (en) * | 2008-09-29 | 2010-04-01 | Restoration Robotics, Inc. | Object-tracking systems and methods |
US8848974B2 (en) * | 2008-09-29 | 2014-09-30 | Restoration Robotics, Inc. | Object-tracking systems and methods |
US9405971B2 (en) | 2008-09-29 | 2016-08-02 | Restoration Robotics, Inc. | Object-Tracking systems and methods |
US9589368B2 (en) | 2008-09-29 | 2017-03-07 | Restoration Robotics, Inc. | Object-tracking systems and methods |
US9576359B2 (en) * | 2013-11-01 | 2017-02-21 | The Florida International University Board Of Trustees | Context based algorithmic framework for identifying and classifying embedded images of follicle units |
Also Published As
Publication number | Publication date |
---|---|
KR101030853B1 (en) | 2011-04-22 |
US8199983B2 (en) | 2012-06-12 |
BRPI0715630A2 (en) | 2013-07-02 |
CA2661660C (en) | 2013-07-16 |
CN101505659A (en) | 2009-08-12 |
US20120230561A1 (en) | 2012-09-13 |
WO2008024955A2 (en) | 2008-02-28 |
JP4988847B2 (en) | 2012-08-01 |
AU2007286606B2 (en) | 2010-07-29 |
US8290229B2 (en) | 2012-10-16 |
US20090052738A1 (en) | 2009-02-26 |
CN101505659B (en) | 2011-09-28 |
AU2007286606A1 (en) | 2008-02-28 |
JP2010501288A (en) | 2010-01-21 |
EP2053973A2 (en) | 2009-05-06 |
WO2008024955A3 (en) | 2008-06-12 |
EP2053973A4 (en) | 2012-08-01 |
WO2008024955A9 (en) | 2008-04-10 |
CA2661660A1 (en) | 2008-02-28 |
KR20090040336A (en) | 2009-04-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20080049993A1 (en) | System and method for counting follicular units | |
US7477782B2 (en) | System and method for classifying follicular units | |
US9107697B2 (en) | System and method for selecting follicular units for harvesting | |
CN103269657B (en) | For revising the method and system of the parameter of automation process |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: RESTORATION ROBOTICS, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:QURESHI, SHEHRZAD A.;BODDULURI, MOHAN;REEL/FRAME:018402/0117 Effective date: 20061017 |
|
STCB | Information on status: application discontinuation |
Free format text: EXPRESSLY ABANDONED -- DURING EXAMINATION |