WO2019039595A1 - Device and method for estimating a cervical cytology category
- Publication number: WO2019039595A1
- Application: PCT/JP2018/031409
- Authority: WIPO (PCT)
- Prior art keywords
- image
- cell
- category
- stained
- estimation
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N33/00—Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
- G01N33/48—Biological material, e.g. blood, urine; Haemocytometers
- G01N33/50—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing
- G01N33/53—Immunoassay; Biospecific binding assay; Materials therefor
- G01N33/574—Immunoassay; Biospecific binding assay; Materials therefor for cancer
Definitions
- the present invention relates to an estimation apparatus and estimation method for cervical cytology categories.
- In cervical cancer screening, it is common to perform cytology first, prior to diagnosis by a pathologist. Cytology is known as one of the few screening methods that have been shown to reduce mortality.
- Papanicolaou (Pap)-stained specimens are generally prepared from samples obtained by scraping the cervix. A cytologist then searches the stained specimen under the microscope for epithelial cells showing the characteristic morphology of human papillomavirus (HPV) infection, and classifies the degree of progression of precancerous lesions and the like according to a classification system (for example, the Bethesda system).
- Based on the cytology result, the pathologist determines the patient's risk of cervical cancer and performs a detailed examination, such as a histological diagnosis, to make a final diagnosis. Early detection of precancerous lesions by cytology, prior to diagnosis by such a doctor, is considered useful for treating patients.
- The recommended screening interval is every two years from the age of 20; although the screening rate remains low, at around 20%, the number of examinations has increased 2.5-fold compared with 10 years ago.
- Because the majority of cervical cancers are caused by HPV infection transmitted through sexual intercourse, and because sexual activity tends to begin at younger ages, the demand for cervical cancer screening is expected to increase further in the future.
- Because cervical cancer screening by cytology relies on manual judgment by an expert cytologist, it requires labor and cost, and the cost of examinations places a heavy burden on municipalities.
- The epithelial cells targeted by cytology may exhibit various morphologies depending on age, the presence or absence of inflammation, hormonal environment, and so on, which can make the determination difficult.
- Moreover, because the judgment is made manually, it requires skill, and results may differ between cytologists.
- The present invention aims to provide a new system and method capable of easily estimating a cytology category for a cervical specimen, for example prior to a pathologist's diagnosis of possible cervical cancer.
- The estimation apparatus of the present invention is an apparatus for estimating a cervical cytology category, comprising a sample image input unit, a cell image extraction unit, a cell shape classification unit, and a cytology category estimation unit.
- The sample image input unit inputs a sample image of a stained sample.
- The stained sample is a cervical sample stained with a MUC conjugate.
- The cell image extraction unit extracts cell images of stained cells from the sample image.
- The cell shape classification unit classifies the stained cells in the cell images into corresponding morphology categories, based on cell morphology classification information in which cell morphology information is associated with each of two or more morphology categories,
- and determines, for the sample image, the composition pattern of the morphology categories of the stained cells extracted from the sample image.
- The cytology category estimation unit estimates a cytology category for the stained sample of the sample image from the composition pattern of the sample image, based on cytology category estimation information in which each of two or more cytology categories is associated with the composition pattern of cervical morphology categories belonging to that cytology category.
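The claimed units can be read as a staged pipeline: extract stained-cell images, classify each cell into a morphology category, summarize the per-cell labels into a composition pattern, and map that pattern to a cytology category. The following is a minimal Python sketch of the last two stages; the morphology labels are stubbed, and the category names and reference patterns are illustrative placeholders, not values taken from the patent.

```python
from collections import Counter

# Hypothetical morphology categories (the patent allows 2 to 100 of them).
MORPH_CATEGORIES = ["M1", "M2", "M3", "M4", "M5"]

def composition_pattern(morph_labels):
    """Turn per-cell morphology labels into a normalized composition pattern."""
    counts = Counter(morph_labels)
    total = len(morph_labels)
    return [counts[c] / total for c in MORPH_CATEGORIES]

def estimate_category(pattern, reference_patterns):
    """Pick the cytology category whose reference pattern is nearest (L1 distance)."""
    def l1(p, q):
        return sum(abs(a - b) for a, b in zip(p, q))
    return min(reference_patterns, key=lambda cat: l1(pattern, reference_patterns[cat]))

# Illustrative per-category reference patterns (placeholder numbers, not patent data).
REFERENCE = {
    "NILM": [0.8, 0.1, 0.1, 0.0, 0.0],
    "LSIL": [0.3, 0.4, 0.2, 0.1, 0.0],
    "HSIL": [0.1, 0.1, 0.3, 0.3, 0.2],
}

labels = ["M1"] * 8 + ["M2"] + ["M3"]   # stubbed output of the morphology classifier
pattern = composition_pattern(labels)
print(estimate_category(pattern, REFERENCE))  # nearest pattern -> NILM
```

In the apparatus itself, learned models (described later) would replace both the stubbed labels and this simple nearest-pattern rule.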
- The estimation method of the present invention is a method of estimating a cervical cytology category, including a sample image input step, a cell image extraction step, a cell shape classification step, and a cytology category estimation step.
- In the sample image input step, a sample image of a stained sample is input.
- The stained sample is a cervical sample stained with a MUC conjugate.
- In the cell image extraction step, cell images of stained cells are extracted from the sample image.
- In the cell shape classification step, the stained cells in the cell images are classified into corresponding morphology categories, based on cell morphology classification information in which cell morphology information is associated with each of two or more morphology categories,
- and the composition pattern of the morphology categories of the stained cells extracted from the sample image is determined for the sample image.
- In the cytology category estimation step, a cytology category is estimated for the stained sample of the sample image from the composition pattern of the sample image, based on cytology category estimation information in which each of two or more cytology categories is associated with the composition pattern of cervical morphology categories belonging to that cytology category.
- the program of the present invention is characterized by causing a computer to execute the method of the present invention for estimating the cytology category of the cervix.
- the recording medium of the present invention is a computer readable recording medium in which the program of the present invention is recorded.
- According to the present invention, cytology category estimation can easily be performed on a cervical specimen prior to a pathologist's diagnosis of cervical cancer morbidity.
- FIG. 1 is a block diagram showing an example of an estimation apparatus of the present invention.
- FIG. 2 is a block diagram showing an example of the hardware configuration of the estimation apparatus of the present invention.
- FIG. 3 is a flow chart showing an example of a part of the estimation method of the present invention.
- FIG. 4 is a flowchart showing an example of a part of the estimation method of the present invention.
- FIG. 5 is a photograph showing an example of a sample image.
- FIG. 6 is a flowchart showing an example of extracting stained cells from a sample image.
- FIG. 7 is a conceptual view showing an example of labeling from a sample image.
- FIG. 8 is a flowchart showing an example of selecting a region with many stained pixels from the label.
- FIG. 9 is a photograph showing an example of classification of stained cells.
- FIG. 10 is a conceptual diagram showing an example of the relationship between the cytological classification of the Bethesda system and the classification ratio of stained cells.
- FIG. 11 is a flowchart showing an example of a method of generating a cell shape estimation model.
- the cytology category is a category of Bethesda system.
- In the estimation apparatus of the present invention, for example, the cell morphology classification information is a cell shape estimation model that estimates the corresponding morphology category for cells in a cell image.
- The cell shape estimation model is a model generated by learning from cervical stained cell images for learning corresponding to each of the morphology categories,
- and the cell shape classification unit classifies the stained cells in the cell images into corresponding morphology categories using the cell shape estimation model.
- In the estimation apparatus of the present invention, for example, the cytology category estimation information is a cytology category estimation model that estimates the corresponding cytology category for the cell composition pattern of a sample image.
- The cytology category estimation model is a model generated by learning from composition patterns of the morphology categories of cervical stained cells corresponding to the respective cytology categories,
- and the cytology category estimation unit estimates the cytology category of the stained sample of the sample image using the cytology category estimation model.
- the cell image extraction unit extracts a cell image of stained cells from the sample image using a cell image extraction model
- the cell image extraction model is a model generated by learning from a stained cell image in a cervical stained sample image for learning.
- the number of the form categories is 5 to 100.
- The estimation apparatus of the present invention further includes, for example, a suitability determination unit for the sample image,
- and the suitability determination unit detects determination items in the sample image and, when the detection results satisfy predetermined criteria, determines that the image is suitable as a sample image for the cell image extraction unit.
- the determination item is at least one selected from the group consisting of presence or absence of image blurring, number of cells, degree of blood contamination, and degree of inflammation in the sample image.
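Among these determination items, image blurring in particular can be checked without a learned model. One common heuristic (an assumption here; the patent does not specify a detection method) is the variance of the Laplacian of the grayscale image, where low variance suggests blur. A minimal pure-Python sketch, with hypothetical thresholds:

```python
def laplacian_variance(img):
    """Variance of the 4-neighbour Laplacian of a 2D grayscale image (list of lists)."""
    h, w = len(img), len(img[0])
    vals = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y - 1][x] + img[y + 1][x] + img[y][x - 1] + img[y][x + 1]
                   - 4 * img[y][x])
            vals.append(lap)
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

def is_suitable(img, cell_count, blur_threshold=10.0, min_cells=100):
    """Judge suitability on two of the determination items: blur and cell number.
    Both thresholds are illustrative placeholders, not values from the patent."""
    return laplacian_variance(img) >= blur_threshold and cell_count >= min_cells
```

The blood-contamination and inflammation items would require color and cell-type analysis and are omitted from this sketch.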
- the cytology category is a category of Bethesda system.
- In the estimation method of the present invention, for example, the cell morphology classification information is a cell shape estimation model that estimates the corresponding morphology category for cells in a cell image.
- The cell shape estimation model is a model generated by learning from cervical stained cell images for learning corresponding to each of the morphology categories,
- and the cell shape classification step classifies the stained cells in the cell images into corresponding morphology categories using the cell shape estimation model.
- In the estimation method of the present invention, for example, the cytology category estimation information is a cytology category estimation model that estimates the corresponding cytology category for the cell composition pattern of a sample image.
- The cytology category estimation model is a model generated by learning from composition patterns of the morphology categories of cervical stained cells corresponding to the respective cytology categories,
- and a cytology category is estimated for the stained sample of the sample image using the cytology category estimation model.
- the cell image extraction step extracts a cell image of stained cells from the sample image using a cell image extraction model;
- the cell image extraction model is a model generated by learning from a stained cell image in a cervical stained sample image for learning.
- the number of morphology categories is 5 to 100.
- The estimation method of the present invention further includes, for example, a suitability determination step for the sample image,
- and the suitability determination step detects determination items in the sample image and, when the detection results satisfy predetermined criteria, determines that the image is suitable as a sample image for the cell image extraction step.
- the determination item is at least one selected from the group consisting of presence or absence of image blurring, cell number, degree of blood contamination, and degree of inflammation in the sample image.
- MUC is a core protein of mucin and is known as a component of mucus.
- Because normal cervical epithelium does not produce mucus, it has been common technical knowledge that MUC is absent from the cervical epithelium.
- The present inventors, however, found that MUCs whose expression is not confirmed in normal epithelium are specifically expressed in the cervix in association with disease, and that by detecting the presence of MUC, the possibility of cervical cancer morbidity can be tested.
- From the relationship between MUC and cervical cancer described above, the present inventors further found that, using a specimen image of a cervical specimen stained with a MUC conjugate,
- the composition pattern of the morphology categories of the stained cells allows estimation of a cytology category such as a Bethesda system category.
- That is, when a cervical specimen is stained with the MUC conjugate
- and the morphology of the stained cells is classified, it becomes clear that the composition pattern of the morphologies differs depending on the condition of the cervix.
- Therefore, not only a doctor such as a pathologist
- but even a person without a medical license can easily and indirectly estimate, from the composition pattern of the stained-cell morphologies, which cytology category is likely to apply.
- the type of cytology for estimating the category is not particularly limited, and examples thereof include Bethesda system (also referred to as Bethesda classification) and the like.
- The cytology used for category estimation is not limited to, for example, cytology systems known at the time of filing of the present application or at the time of its priority application, such as the Bethesda system; other systems are also available.
- the cytology category means, for example, categories classified in a certain cytology (for example, including the meaning of class, level, stage, etc.).
- the category is, for example, a category (NILM, ASC-US, ASC-H, LSIL, HSIL, SCC) as shown in Table 1 described later.
- the Bethesda system is shown as an example of the cytology category. According to the Bethesda system, cells are classified into six categories as shown in Table 1 below. According to the present invention, for example, as described later, the cytology category by the Bethesda system can be estimated from the composition pattern of the morphological category of stained cells in the sample image.
- MUC is a core protein family of mucins, and examples thereof include MUC1, MUC2, MUC3, MUC4, MUC5AC, MUC5B, MUC6, MUC7 and the like.
- MUC may be, for example, any one type or two or more types, and among them, MUC1 is preferable.
- the type of MUC conjugate is not particularly limited, and it is a binding substance having binding activity to MUC, preferably a binding substance exhibiting specific binding activity, and examples thereof include MUC antibody and the like.
- the specimen image used in the present invention is, as described above, an image of a cervical specimen stained with a MUC conjugate.
- the preparation method of the cervical sample is not particularly limited, and, for example, general slide preparation in cytology can be used.
- The method of staining the cervical specimen with the MUC conjugate is not particularly limited; for example, general staining methods that use the binding of a target (for example, an antigen) to its conjugate (for example, an antibody) are available.
- In the following, an image of a cervical sample stained with a MUC antibody is exemplified as the sample image, but the present invention is not limited thereto; the image may be of a sample stained by any binding substance having binding affinity for MUC.
- In the present invention, a sample that is the target of cytology category estimation is referred to as a "subject".
- Embodiment 1 An example of the estimation apparatus and estimation method of the present invention will be described with reference to the drawings.
- FIG. 1 is a block diagram showing an example of an estimation apparatus of the present embodiment.
- the estimation apparatus 1 includes a sample image input unit 11, a cell image extraction unit 12, a cell shape classification unit 13, and a cytology category estimation unit 14.
- the estimation device 1 may further include, for example, a storage unit 15 and an output unit 16.
- the storage unit 15 includes, for example, a processing information storage unit 151, a cell image extraction information storage unit 152, a cell type classification information storage unit 153, a cytology category estimation information storage unit 154, and the like.
- The processing information storage unit 151 stores, for example, processing information (for example, input information input to the estimation device 1, output information output from the estimation device 1, and information obtained by the estimation device 1),
- the cell image extraction information storage unit 152 stores cell image extraction information,
- the cell type classification information storage unit 153 stores cell type classification information,
- and the cytology category estimation information storage unit 154 stores cytology category estimation information.
- the estimation device 1 is also referred to, for example, as an estimation system.
- the estimation apparatus 1 may be, for example, one estimation apparatus including the respective units, or the respective units may be an estimation apparatus connectable via a communication network.
- The communication network is not particularly limited; a known network, wired or wireless, may be used, and specific examples include the Internet, a telephone network, a LAN (Local Area Network), and WiFi (Wireless Fidelity).
- the processing of each unit may be performed on the cloud.
- FIG. 2 illustrates a block diagram of a hardware configuration of the estimation device 1.
- the estimation device 1 includes, for example, a CPU (central processing unit) 101, a memory 102, a bus 103, an input device 104, a display 105, a communication device 106, a storage device 107, an imaging device 113, and the like.
- the respective units of the estimation device 1 are mutually connected via a bus 103 by respective interfaces (I / F).
- the CPU 101 is responsible for overall control of the estimation device 1.
- the program of the present invention and other programs are executed by the CPU 101, and reading and writing of various information are performed.
- the CPU 101 functions as the sample image input unit 11, the cell image extraction unit 12, the cell shape classification unit 13, and the cytology category estimation unit 14.
- the estimation apparatus 1 can be connected to a communication network by, for example, the communication device 106 connected to the bus 103, and can also be connected to an external device via the communication network.
- the external device is not particularly limited, and examples thereof include an imaging device such as a camera, a terminal such as a personal computer (PC), a tablet, and a smartphone.
- the connection method between the estimation device 1 and the external device is not particularly limited, and may be, for example, wired connection or wireless connection.
- the wired connection may be, for example, a cord connection or a cable connection for using a communication network.
- the wireless connection may be, for example, a connection using a communication network or a connection using wireless communication.
- the communication network is not particularly limited, and for example, a known communication network can be used and is the same as described above.
- the connection type between the estimation device 1 and the external device may be, for example, USB or the like.
- the memory 102 includes, for example, a main memory, and the main memory is also referred to as a main storage device.
- the main memory is, for example, a RAM (random access memory).
- the memory 102 further includes, for example, a ROM (read only memory).
- the storage device 107 is also called, for example, a so-called auxiliary storage device with respect to the main memory (main storage device).
- the storage device 107 includes, for example, a storage medium and a drive that reads and writes to the storage medium.
- The storage medium is not particularly limited and may be, for example, of a built-in or external type; examples include an HD (hard disk), an FD (floppy (registered trademark) disk), a CD-ROM, a CD-R, a CD-RW, an MO, a DVD, a flash memory, and a memory card. The drive is also not particularly limited.
- the storage device 107 can also be exemplified by, for example, a hard disk drive (HDD) in which a storage medium and a drive are integrated.
- The operation program 108 is stored in the storage device 107 and, as described above, is read from the storage device 107 into the memory 102 when executed by the CPU 101.
- The storage device 107 may also store, for example, the above-mentioned processing information 109 (for example, the input information, the output information, and information obtained by the estimation device 1), the cell image extraction information (for example, the cell image extraction model 110),
- the cell shape classification information (for example, the cell shape estimation model 111),
- and the cytology category estimation information (for example, the cytology category estimation model 112).
- the estimation device 1 may further include an imaging device 113.
- the imaging device 113 is, for example, a camera.
- the imaging device 113 can image the stained sample and input the image.
- the estimation device 1 may further include, for example, an input device 104 and a display 105.
- the input device 104 is, for example, a scanner that reads an image, a touch panel, or a keyboard.
- the display 105 is, for example, an LED display, a liquid crystal display or the like, and also serves as the output unit 16.
- the cell image extraction information is information for extracting stained cells from the sample image.
- Examples of the cell image extraction information include a cell image extraction model 110.
- The cell image extraction model 110 is, for example, a model generated by learning from stained cell images (for example, cut-out stained cell images) in cervical stained sample images for learning accumulated in a database.
- When a cervical specimen is stained with the MUC conjugate,
- the resulting stained specimen contains a mixture of cells that are stained because the MUC conjugate binds to them and cells that are not stained because the MUC conjugate does not bind.
- The learning data may further include, for example, non-stained cell images from the stained sample images for learning; in that case, the stained cell images and the non-stained cell images are learned together as learning data, and a model that can discriminate stained cells from non-stained cells may be generated.
- the learning may be, for example, any of AI, machine learning, deep learning, and the like.
- the machine learning can use, for example, SVM (Support Vector Machine) or the like, and the deep learning can use, for example, CNN (Convolutional Neural Network) or the like.
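As a runnable miniature of this kind of learning, the sketch below trains the classic perceptron, a much simpler linear classifier than an SVM or CNN, to discriminate two kinds of cells from hypothetical two-dimensional features (for example, relative nucleus area and stain intensity); all feature values and labels are illustrative and not taken from the patent.

```python
def train_perceptron(data, labels, epochs=10):
    """Classic perceptron: learn a linear decision rule from labeled 2-D feature
    vectors (a stand-in here for the SVM or CNN learning named in the text)."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        updated = False
        for (x1, x2), y in zip(data, labels):
            if y * (w[0] * x1 + w[1] * x2 + b) <= 0:  # misclassified (or on boundary)
                w[0] += y * x1
                w[1] += y * x2
                b += y
                updated = True
        if not updated:  # converged: all training points classified correctly
            break
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1

# Hypothetical features (relative nucleus area, stain intensity), scaled to [0, 1]:
# -1 = one cell class (e.g. non-stained), +1 = the other (e.g. stained).
X = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.25), (0.8, 0.9), (0.9, 0.8), (0.85, 0.75)]
Y = [-1, -1, -1, 1, 1, 1]
w, b = train_perceptron(X, Y)
print(predict(w, b, (0.1, 0.1)), predict(w, b, (0.9, 0.9)))  # -1 1
```

A practical system would of course learn from image pixels or richer features, with an established SVM or CNN implementation.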
- The same applies, for example, to the other learning described in the present invention.
- the cell morphology classification information is, as described above, information in which each of two or more morphology categories is associated with cell morphology information.
- the number of the morphology categories and the cell morphology information of each of the morphology categories are not particularly limited, and can be set arbitrarily.
- the present invention is a technique based on the fact that the compositional pattern of the morphology of the stained cells is different depending on the condition of the cervix as described above. For this reason, it can be said that, for example, as the number of morphological categories is relatively large, it is possible to estimate cytology categories with higher accuracy.
- The lower limit of the number of morphology categories is 2; for example, 5 or more, or 20 or more, is preferable. The upper limit is not particularly limited; for example, 100 or less, or 30 or less, is preferable.
- the cell morphology classification information may be stored in a database, for example, or may be a cell morphology estimation model 111.
- the cell shape estimation model 111 is, for example, a model for estimating a corresponding morphology category for cells in a cell image, and for example, from a learning stained cell image of the cervix corresponding to each of the morphology categories accumulated in the database, It can be generated by learning.
- As the stained cell images for learning, for example, stained cell images of forms characteristic of each cytology category and stained cell images of forms common to the cytology categories can be used.
- the type of the cell shape information is not particularly limited, and examples thereof include items as shown in Table 2 below, and as shown in Table 3 below, can be linked to each of the shape categories.
- the cytology category estimation information is information in which each of two or more cytology categories is associated with a composition pattern of a cervical morphological category belonging to each of the cytology categories.
- As described above, the composition pattern of the morphologies differs depending on the condition of the cervix. Therefore, for example, the composition pattern of the morphology categories of stained cells can be set in advance for each cytology category, and the corresponding cytology category can be estimated from the composition pattern of the subject's sample image.
- the cytology category estimation information may be stored in, for example, a database, or may be a cytology category estimation model 112.
- The cytology category estimation model 112 is, for example, a model that estimates the corresponding cytology category from the cell composition pattern of a sample image. It can be generated by learning from composition patterns of the morphology categories of stained cells in the cervix, corresponding to each cytology category, accumulated in a database.
- the sample image input unit 11 inputs a sample image of the stained sample.
- The sample image may be input, for example, by imaging a slide of the stained sample, or by reading an image obtained by photographing the slide.
- the magnification of the sample image to be input is not particularly limited, and can be appropriately set or changed according to, for example, the magnification of the microscope at the time of photographing the slide.
- the cell image extraction unit 12 extracts a cell image of stained cells from the sample image.
- the method for extracting stained cells from the sample image is not particularly limited, and can be extracted using, for example, the cell image extraction model 110 as described above.
- Cell images may be extracted for a large number of the detectable stained cells contained in the sample image, preferably for all detectable stained cells. As described above, since approximately 100,000 cells are usually present in a stained sample, it is impossible for a cytologist to confirm all stained cells by human judgment. According to the estimation apparatus of the present invention, however, automatic image-based analysis is possible, so cell images can be extracted for all detectable stained cells.
- The cell image extraction unit 12 may also serve, for example, as a counting unit that counts the number of cell images of stained cells extracted from the sample image, or the estimation device 1 may further include a separate counting unit.
- The cell morphology classification unit 13 classifies the stained cells in the cell images into corresponding morphology categories based on the cell morphology classification information, and determines, for the sample image, the composition pattern of the morphology categories of the stained cells in the cell images extracted from the sample image.
- the classification into the morphology category can be performed, for example, using the cell morphology estimation model 111 as described above.
- the way of expressing the composition pattern is not particularly limited, and may be, for example, the cell ratio or frequency distribution of cells belonging to each of the morphology categories.
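As a minimal illustration, expressing the composition pattern as the cell ratio per morphology category can be sketched as follows; the helper name and the example labels are assumptions, not part of the invention:

```python
from collections import Counter

def composition_pattern(category_labels):
    """Express a specimen's stained-cell morphology composition as the
    ratio of cells in each morphology category (hypothetical helper)."""
    counts = Counter(category_labels)
    total = sum(counts.values())
    return {cat: counts[cat] / total for cat in sorted(counts)}

# Hypothetical category labels assigned to each extracted stained-cell image:
labels = ["A"] * 20 + ["B"] * 40 + ["C"] * 10 + ["D"] * 30
pattern = composition_pattern(labels)  # {"A": 0.2, "B": 0.4, "C": 0.1, "D": 0.3}
```

A frequency distribution (raw counts per category) could be used instead simply by returning the counts without dividing by the total.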
- The cell morphology classification unit 13 may also serve, for example, as a counting unit that counts the number of cell images classified into each morphology category, or the estimation device 1 may further include a separate counting unit.
- the cytology category estimation unit 14 estimates a cytology category for the stained sample of the sample image from the composition pattern of the sample image based on the cytology category estimation information.
- the estimation of the cytology category can be performed using, for example, the cytology category estimation model 112 as described above.
- the estimation method of the present embodiment can be performed, for example, by the estimation device 1 of the present embodiment.
- the sample image input step is a step of inputting a sample image of the stained sample, and can be executed by the sample image input unit 11 of the estimation device 1.
- the cell image extraction step is a step of extracting a cell image of a stained cell from the sample image, and can be executed by the cell image extraction unit 12 of the estimation device 1.
- The cell morphology classification step classifies the stained cells in the cell images into corresponding morphology categories based on the cell morphology classification information, and determines, for the sample image, the composition pattern of the morphology categories of the stained cells in the cell images extracted from the sample image. This step can be performed by the cell morphology classification unit 13 of the estimation device 1.
- the cytology category estimation step is a step of estimating a cytology category for the stained specimen of the specimen image from the composition pattern of the specimen image based on the cytology category estimation information. This process can be performed, for example, by the cytology category estimation unit 14 of the estimation device 1.
- Embodiment 2: In this embodiment, an example of a mode in which the estimation apparatus and estimation method of the present invention determine the appropriateness of the sample image will be described. Unless otherwise indicated, the description of Embodiment 1 applies to this Embodiment 2.
- the estimation apparatus 1 may further include, for example, an appropriateness determination unit of the sample image.
- The estimation method of the present embodiment may further include, for example, a step of determining the appropriateness of the sample image, and this step can be performed, for example, by the appropriateness determination unit.
- The appropriateness determination unit and the appropriateness determination step, for example, detect determination items for the sample image and, when the detection results satisfy the suitability criteria, determine that the sample image to be used by the cell image extraction unit is appropriate.
- In the present invention, a sample image of the stained sample is used; if there is a problem with the stained sample itself or with the sample image, an erroneous estimation may result. Erroneous estimation can therefore be suppressed by judging the appropriateness of the sample image before extracting the stained cells.
- The determination items include, for example, the presence or absence of blurring of the sample image, as an item for determining the appropriateness of the sample image itself. For example, when blurring is detected in the sample image, the sample image is determined to be inappropriate and the cell image extraction process by the cell image extraction unit is stopped; when no blurring is detected, the sample image is determined to be appropriate and the cell image extraction process by the cell image extraction unit is executed.
- The determination items also include, for example, the number of cells, the degree of blood contamination, and the degree of inflammation in the sample image, as items for determining the appropriateness of the stained sample itself. For example, when it is detected in the sample image that the number of cells is below the threshold, the degree of blood contamination exceeds the threshold, or the degree of inflammation exceeds the threshold, the stained sample and its sample image are determined to be inappropriate and the cell image extraction step by the cell image extraction unit is discontinued. Conversely, when the number of cells is at or above the threshold, the degree of blood contamination is at or below the threshold, and the degree of inflammation is at or below the threshold, the sample image is determined to be appropriate.
- For the determination items, the sample image is determined to be inappropriate when, for example, any one item is found inappropriate.
- The threshold value of each determination item is not particularly limited, and the setting can be changed depending on, for example, how strict the determination should be.
- The number of cells can be determined from, for example, the number of cells in a predetermined area.
- The degree of blood contamination can be determined from, for example, the ratio of blood in a predetermined area.
- the degree of inflammation can be determined from the appearance rate of neutrophils, for example.
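The determination items above can be sketched as a simple gate applied before cell-image extraction. The function name and every threshold value below are illustrative assumptions, since the text leaves the concrete thresholds unspecified:

```python
def is_sample_appropriate(blurred, cell_count, blood_ratio, neutrophil_rate,
                          min_cells=1000, max_blood=0.3, max_inflammation=0.5):
    """Return (ok, reason): check the sample image itself (blur) first, then
    the stained-sample items (cell count, blood contamination, inflammation).
    All thresholds here are hypothetical placeholders."""
    if blurred:
        return False, "image blurred"       # sample image itself inappropriate
    if cell_count < min_cells:
        return False, "too few cells"       # stained sample inappropriate
    if blood_ratio > max_blood:
        return False, "blood contamination"
    if neutrophil_rate > max_inflammation:
        return False, "inflammation"
    return True, "appropriate"              # proceed to cell image extraction
```

As in the text, the first item found inappropriate makes the whole sample image inappropriate.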
- An appropriateness determination model can also be used for the determination.
- The appropriateness determination model can be generated, for example, by using, as learning data, inappropriate sample images and appropriate sample images for each determination item accumulated in a database.
- the sample image input step is a step of inputting a sample image of the stained sample.
- the magnification of the sample image is not particularly limited. For example, when the actual size of the slide is 1 ⁇ , an image of about 100 ⁇ can be exemplified.
- the suitability determination step includes, for example, the steps of A2-1 to A2-6.
- the determination item is detected for the input sample image (A2-1).
- the determination items are, for example, the presence or absence of blurring of an image, the number of cells, the degree of blood contamination, the degree of inflammation, etc., and these are detected from the sample image.
- the order of detection of each of these determination items is not particularly limited.
- Next, an appropriateness determination is performed for each determination item of the sample image.
- The appropriateness determination includes, for example, as described above, determination of the appropriateness of the sample image itself and determination of the appropriateness of the stained sample itself.
- The order of these determinations is not particularly limited, and either may come first. However, when the sample image itself is inappropriate, it is difficult to carry out the subsequent processes at all, so it is preferable to determine the appropriateness of the stained sample itself after determining the appropriateness of the sample image.
- First, the presence or absence of blurring is detected for the sample image (A2-2); when the image is blurred (YES), the sample image itself is determined to be inappropriate and the process ends (END).
- When no blurring is detected (NO), the number of cells contained in the sample image is further detected (A2-3); when the number of cells is small (YES), the stained sample itself is determined to be inappropriate and the process ends (END).
- When the number of cells is not small, that is, sufficient (NO), blood contamination is detected in the sample image (A2-4); if the blood contamination is significant (YES), the stained specimen itself is determined to be inappropriate and the process ends (END).
- Similarly, the degree of inflammation is detected (A2-5); if the inflammation is significant (YES), the stained specimen itself is determined to be inappropriate and the process ends (END). When no item is found inappropriate, the sample image is determined to be appropriate (A2-6), and the process proceeds to the cell image extraction step.
- A3 Cell Image Extraction Step
- In this step, an image (image area) of stained cells stained with the MUC conjugate is extracted from the sample image determined to be appropriate in step (A2-6).
- the extraction of the image includes, for example, the specification of the image area of the stained cell in the sample image, and the clipping of the specified image area.
- FIG. 5 is a schematic view of the extraction of the stained cells from the sample image. FIG. 5(A) shows a part of the sample image, and FIG. 5(B) shows images of the stained cells contained in FIG. 5(A). As shown in FIG. 5(A), the image areas of the stained cells (image areas 1, 2, 3 in FIG. 5(A)) are identified in the sample image in which a plurality of stained areas exist, and, as shown in FIG. 5(B), the image areas 1, 2, 3 are cut out as stained cell images.
- the extraction of the image of the stained cells from the sample image may be performed, for example, by conventional image processing, or the cell image extraction model may be used.
- The flowchart of FIG. 6 shows an example of the extraction of a cell image of the stained cells from the sample image.
- First, the sample image determined to be appropriate in step (A2) is input (A3-1), and binarization is performed on the sample image so that the stained areas become white and the unstained areas become black (A3-2).
- the binarization process can use, for example, a cell image extraction model 110.
- Next, white regions that overlap and/or are adjacent are collectively labeled as one cell (A3-3).
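Steps A3-2 and A3-3 amount to binarization followed by connected-component labeling. A minimal pure-Python sketch is shown below; the 8-connectivity choice and the helper names are assumptions (in practice, OpenCV functions such as `cv2.threshold` and `cv2.connectedComponents` would typically be used):

```python
from collections import deque

def binarize(image, is_stained):
    """A3-2: stained pixels -> 1 (white), unstained pixels -> 0 (black)."""
    return [[1 if is_stained(px) else 0 for px in row] for row in image]

def label_cells(binary):
    """A3-3: collect overlapping/adjacent white pixels into one label per
    cell, using 8-connectivity; returns a list of pixel-coordinate sets."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    labels = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not seen[y][x]:
                region, queue = set(), deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    region.add((cy, cx))
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = cy + dy, cx + dx
                            if 0 <= ny < h and 0 <= nx < w \
                                    and binary[ny][nx] and not seen[ny][nx]:
                                seen[ny][nx] = True
                                queue.append((ny, nx))
                labels.append(region)
    return labels
```

Each returned region corresponds to one "label" from which the most strongly stained area is then selected in the following steps.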
- From each label (region labeled as one cell) in the sample image, the region having the largest number of stained pixels is further selected (A3-4) and cut out as an image (A3-5). Specifically, for example, as shown in the schematic diagram of FIG. 7, the region Y with the largest number of stained pixels is selected from the label, and the region Y is cut out as an image.
- The magnification of the clipped image is not particularly limited; for example, when the actual size of the slide is 1×, an image of about 400× can be exemplified. When a plurality of labels are present in one sample image, the selection and clipping of the region having the largest number of stained pixels are repeated for each label.
- The selection and cutting out (A3-3 to A3-5) of the region having the largest number of stained pixels from the label can be performed, for example, by a conventional method.
- An example of the selection and segmentation is shown in FIG.
- The right-hand view of FIG. 8 is the same as FIG. 7; it shows the image area labeled as one cell enclosed by a bounding box X with height h in the Y-axis direction and width w in the X-axis direction.
- Y′ in the bounding box X is a search rectangle indicating the region in which stained pixels are counted in the flowchart described later.
- The size of the search rectangle Y′ is arbitrary and can be expressed by a height b (b ≤ h) in the Y-axis direction and a width a (a ≤ w) in the X-axis direction.
- Here, h is the height of the bounding box X in the Y-axis direction, and w is the width of the bounding box X in the X-axis direction.
- Initially, the variable j indicating the position in the Y-axis direction is set to 0, and the variable i indicating the position in the X-axis direction is set to 0.
- The left-hand view of FIG. 8 is a flowchart of the selection and clipping. For the image area of the bounding box X in FIG. 8, for example, the following steps are performed as shown in the flowchart.
- (a4) The stained pixels in the search rectangle Y′ at the position given by the variables j and i are counted (cnt).
- (a5) When the count exceeds the maximum value so far, the maximum value and the corresponding coordinate position are updated. The counting position is then moved to the next position (i+1, j) or (i, j+1), and the above (a4) and (a5) are similarly repeated (a8).
- When the whole bounding box has been searched, the maximum value of the stained-pixel count is held in cnt, and the coordinate position x, y indicates the position of the search rectangle Y′ giving that maximum. Therefore, an image having width a and height b is cut out from the point (x, y), and the process ends (END).
- When this selection and clipping are performed on the right-hand diagram of FIG. 8, for example, the region Y in the diagram of FIG. 7 is selected and clipped.
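The flowchart steps (a4) to (a8) amount to a sliding-window search for the position where the search rectangle Y′ covers the most stained pixels. A brute-force Python sketch, under the assumption that the labeled bounding box is given as a 0/1 grid (the function name is invented):

```python
def crop_max_stained(binary, a, b):
    """Slide an a-wide, b-high search rectangle Y' over the bounding box,
    count stained pixels at each position (a4), keep the maximum count and
    its coordinates (a5, a8), then cut out the a-by-b image at that point."""
    h, w = len(binary), len(binary[0])
    best_cnt, best_xy = -1, (0, 0)
    for j in range(h - b + 1):        # variable j: position in the Y-axis direction
        for i in range(w - a + 1):    # variable i: position in the X-axis direction
            cnt = sum(binary[j + dy][i + dx]
                      for dy in range(b) for dx in range(a))
            if cnt > best_cnt:
                best_cnt, best_xy = cnt, (i, j)
    x, y = best_xy
    return [row[x:x + a] for row in binary[y:y + b]], best_cnt
```

The per-position recount is O(a·b); an integral image would make each count O(1), but the brute-force form mirrors the flowchart directly.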
- (A4) Next, the cell images of stained cells extracted in step (A3) may be counted.
- Expression of MUC is not confirmed in normal epithelium but is specific to cervical intraepithelial neoplasia. Therefore, when the cervical specimen is normal, stained cells may not be detected even after staining with the MUC conjugate. Thus, for example, the cell images of stained cells extracted from the sample image are counted (A4); when the number of stained cells is small (YES), the sample of the sample image may be estimated to be normal and the process may end (END).
- Specifically, a threshold for the number of stained cells per slide is set; when the count is below the threshold, the sample can be estimated to be normal.
- Since a slide in cytology contains approximately 100,000 cells, the threshold for normality can be set, for example, to a ratio of about 1% of stained cells, or to a stained cell count of about 1,000.
- In this case, the sample can be estimated to be normal, that is, the cytology category "NILM".
- When the number of stained cells is not small (NO), for example, when it is at or above the threshold, the sample cannot be deduced to be normal, and the process proceeds to the next cell morphology classification step.
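The normality screen of step (A4) can be sketched as follows; the ~1% ratio and 1,000-cell count follow the text above, while the function name is an assumption:

```python
TOTAL_CELLS_PER_SLIDE = 100_000                  # typical cells per cytology slide
NILM_THRESHOLD = TOTAL_CELLS_PER_SLIDE // 100    # ~1% of cells stained = 1,000

def screen_stained_count(n_stained, threshold=NILM_THRESHOLD):
    """Estimate 'NILM' (normal) when fewer stained cells than the threshold
    were extracted; otherwise continue to morphology classification."""
    return "NILM" if n_stained < threshold else "classify"
```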
- The cell morphology classification step is a step of classifying the stained cells in the extracted cell images into corresponding morphology categories based on the cell morphology classification information (A5), and of determining, for the sample image, the composition pattern of the morphology categories of the stained cells in the cell images extracted from the sample image.
- The cell morphology classification step can be performed using, for example, the cell morphology estimation model 111 as described above.
- FIG. 9 shows a schematic view of the classification of the stained cells. As illustrated, each stained cell can be classified, based on the cell morphology classification information, into one of four morphology categories: category A, category B, category C, and category D. This classification can use, for example, the cell morphology estimation model 111 described above.
- Cytology Category Estimation Step: The cytology category estimation step estimates the cytology category for the stained sample of the sample image from the composition pattern of the sample image, based on the cytology category estimation information (A6). This estimation can use, for example, the cytology category estimation model 112 described above.
- the stained specimen contains various forms of stained cells stained with the MUC conjugate.
- The composition patterns of these various forms of stained cells differ depending on the condition of the cervix; that is, they differ significantly between the sample groups belonging to each cytology category. Therefore, for example, by comparing the composition pattern of the subject with the composition pattern of each cytology category, the cytology category whose composition pattern is most similar to the subject's can be estimated to be the subject's cytology category.
- For example, suppose the stained cells in the sample image of the subject are classified into a composition pattern with ratios of 20% category A, 40% category B, 10% category C, and 30% category D.
- On the other hand, the cytology categories NILM, LSIL, HSIL and SCC each have characteristic composition patterns, for example the ratios of categories A to D exemplified in FIG. 10.
- By using the composition pattern of each cytology category as an estimation criterion, the cytology category can be estimated from the composition pattern of the sample image. That is, for example, a composition pattern with ratios of 20% category A, 40% category B, 10% category C, and 30% category D can be estimated as NILM.
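As a minimal illustration of this estimation criterion, the sketch below assigns the category whose reference composition pattern is closest to the specimen's pattern. The reference percentages and the nearest-pattern (L1 distance) rule are illustrative assumptions; in the invention, the estimation would be performed by the learned cytology category estimation model 112:

```python
# Reference composition patterns per cytology category. These percentages
# are invented for illustration, not taken from the invention.
REFERENCE_PATTERNS = {
    "NILM": {"A": 0.20, "B": 0.40, "C": 0.10, "D": 0.30},
    "LSIL": {"A": 0.10, "B": 0.20, "C": 0.40, "D": 0.30},
    "HSIL": {"A": 0.05, "B": 0.15, "C": 0.30, "D": 0.50},
    "SCC":  {"A": 0.05, "B": 0.05, "C": 0.20, "D": 0.70},
}

def estimate_category(pattern, references=REFERENCE_PATTERNS):
    """Assign the cytology category whose reference composition pattern is
    closest (smallest L1 distance) to the specimen's composition pattern."""
    def dist(ref):
        return sum(abs(pattern.get(c, 0.0) - ref[c]) for c in ref)
    return min(references, key=lambda cat: dist(references[cat]))

# The example from the text: 20% A, 40% B, 10% C, 30% D.
subject = {"A": 0.20, "B": 0.40, "C": 0.10, "D": 0.30}
```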
- This embodiment describes the generation of each of the models described above. However, the present invention is not limited to these examples.
- The cell image extraction model 110 can be generated by learning, for example, as described above. In the sample image, a stained region and a non-stained region can be discriminated, for example, by treating the target staining color as positive and other colors as negative. In the learning, for example, image data of cells stained in the target color linked to data indicating a positive example, and image data of cells stained in other colors linked to data indicating a negative example, may be used.
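The positive/negative discrimination by target staining color might be sketched as a per-pixel color test; the RGB target value and tolerance below are invented placeholders, not values from the invention:

```python
def is_target_stain(rgb, target=(140, 60, 160), tol=60):
    """A pixel counts as 'positive' (stained in the target color) when every
    RGB channel lies within `tol` of the hypothetical target stain color;
    all other colors count as 'negative'."""
    return all(abs(c - t) <= tol for c, t in zip(rgb, target))

# Labeling example pixels as positive or negative training examples:
positive = is_target_stain((150, 70, 150))   # near the target stain color
negative = is_target_stain((255, 255, 255))  # white background
```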
- A specific example of the generation of the cell image extraction model 110 is shown below. First, slides of cervical specimens stained with the MUC conjugate are imaged, and images of stained cells are cut out from the imaged slide images and accumulated in a stained cell database. Then, the accumulated stained cell images are input as learning data to the model generation apparatus, and by learning, a model can be generated that selects and extracts image areas containing the stained cells.
- Furthermore, for the sample images accumulated in the stained cell database, whether or not the extraction of the stained cell images is correct is confirmed; by correcting extraction errors and performing further learning, a cell image extraction model 110 with further improved extraction accuracy can be generated.
- the cell shape estimation model 111 can be generated by learning, for example, as described above. In the learning, for example, image data of cells corresponding to each morphology category may be used for learning.
- First, arbitrary plural morphology categories having different forms are determined (B1-1), and each morphology category is associated with the stained cell images belonging to it and accumulated in morphology classification database 1. Then, the stored morphology categories and the stained cell images belonging to them are input as learning data to the model generation apparatus (B1-2), and by learning (B1-3), a model 111 can be generated that determines (estimates) which morphology category a stained cell image to be classified falls under.
- Furthermore, the cell images of stained cells stored in morphology classification database 2 are classified (B1-4), and whether the classification is correct is confirmed. If incorrect, processing to correct the classification errors is performed (B1-5), and further learning is performed (B1-6) to generate a cell morphology estimation model 111 with further improved classification accuracy.
- Although database 1 used in the first learning and database 2 used in checking the classification errors are shown separately, both may be the same database.
- Cytology category estimation model 112 The cytology category estimation model 112 can be generated by learning, for example, as described above. In the learning, for example, a composition pattern of a sample corresponding to each cytology category may be used to learn.
- a specific example of generation of the cytology category estimation model 112 is shown below.
- First, slides of the cervix belonging to each cytology category, stained with the MUC conjugate, are imaged. From the imaged slide images, for example using the cell morphology estimation model 111, the composition pattern of cells belonging to each of the above morphology categories is determined, linked to the cytology category, and accumulated in a composition pattern database.
- By inputting the accumulated composition patterns and cytology categories as learning data to the model generation apparatus, a model 112 can be generated that estimates which cytology category the composition pattern of a sample image to be classified corresponds to.
- Furthermore, the cytology category is classified for the composition patterns stored in the composition pattern database, and whether the classification is correct is determined; by correcting classification errors and performing further learning, a cytology category estimation model 112 with further improved classification accuracy can be generated.
- the cytology category estimation model 112 may be, for example, a cluster model by cluster analysis of the composition pattern and the cytology category.
- a program according to Embodiment 5 of the present invention is a program that can execute the estimation method of the present invention on a computer.
- the program of the present embodiment may be recorded on, for example, a computer readable recording medium.
- the recording medium is not particularly limited, and examples thereof include the above-described storage medium and the like.
- According to the present invention, cytology category estimation can easily be performed on cervical specimens, for example before a pathologist diagnoses cervical cancer morbidity.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Immunology (AREA)
- Chemical & Material Sciences (AREA)
- Urology & Nephrology (AREA)
- Hematology (AREA)
- Biomedical Technology (AREA)
- Molecular Biology (AREA)
- Biochemistry (AREA)
- General Health & Medical Sciences (AREA)
- Pathology (AREA)
- General Physics & Mathematics (AREA)
- Analytical Chemistry (AREA)
- Food Science & Technology (AREA)
- Medicinal Chemistry (AREA)
- Physics & Mathematics (AREA)
- Biotechnology (AREA)
- Microbiology (AREA)
- Cell Biology (AREA)
- Hospice & Palliative Care (AREA)
- Oncology (AREA)
- Investigating Or Analysing Biological Materials (AREA)
- Image Analysis (AREA)
Abstract
The present invention provides a new system with which a cytology classification can easily be estimated for a cervical specimen before diagnosis by a pathologist. This device for estimating a cytology category comprises: a sample image input unit 11 that inputs a sample image of a stained sample, the stained sample being a cervical specimen stained with a MUC conjugate; a cell image extraction unit 12 that extracts a stained cell image from the sample image; a cell morphology classification unit 13 that classifies the stained cells in the cell images into corresponding morphology categories on the basis of cell morphology classification information, and determines, for the sample image, a composition pattern of the morphology categories of the stained cells in the cell images extracted from the sample image; and a cytology category estimation unit 14 that estimates a cytology category for the stained sample in the sample image from the composition pattern of the sample image on the basis of cytology category estimation information.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017161944 | 2017-08-25 | ||
JP2017-161944 | 2017-08-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019039595A1 true WO2019039595A1 (fr) | 2019-02-28 |
Family
ID=65438931
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/031409 WO2019039595A1 (fr) | 2017-08-25 | 2018-08-24 | Dispositif et procédé d'estimation de catégorie de cytodiagnostic du col de l'utérus |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2019039595A1 (fr) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111861975A (zh) * | 2019-04-26 | 2020-10-30 | 学校法人顺天堂 | 辅助疾病分析的方法、装置及计算机程序、以及训练计算机算法的方法、装置及程序 |
US11830188B2 (en) | 2018-05-10 | 2023-11-28 | Sysmex Corporation | Image analysis method, apparatus, non-transitory computer readable medium, and deep learning algorithm generation method |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005043166A1 (fr) * | 2003-10-30 | 2005-05-12 | Sysmex Corporation | Diagnostic de cancer de glande uterine et procede de detection de cellules cancereuses de glande uterine |
JP2007516428A (ja) * | 2003-06-12 | 2007-06-21 | サイティック コーポレイション | 分散プロット分布を用いてスライドの染色品質を決定するシステム |
JP2013020212A (ja) * | 2011-07-14 | 2013-01-31 | Canon Inc | 画像処理装置、撮像システム、画像処理システム |
JP2013541767A (ja) * | 2010-09-16 | 2013-11-14 | ユニバーシティ・オブ・カンザス | セルブロック調製物のデジタル評価のためのシステム及び方法 |
JP2016184224A (ja) * | 2015-03-25 | 2016-10-20 | 株式会社日立ハイテクノロジーズ | 細胞診断支援装置、細胞診断支援方法、遠隔診断支援システム、サービス提供システム、及び画像処理方法 |
JP2017029083A (ja) * | 2015-08-03 | 2017-02-09 | 国立大学法人 東京大学 | ウイルス性婦人科癌の診断方法 |
- 2018-08-24: WO PCT/JP2018/031409 patent/WO2019039595A1/fr active Application Filing
Non-Patent Citations (2)
Title |
---|
INOUE, YOSHIKI: "Cervical Cancer", JAPANESE JOURNAL OF CLINICAL MEDICINE, vol. 67, no. 5, 2009, pages 185 - 190 * |
TSUJI, TAKAHIRO ET AL.: "Malignant tumor, cervical gland lesions - Current status and problems of handling boundary lesions", OBSTETRICAL AND GYNECOLOGICAL THERAPY, vol. 100, no. 1, 2010, pages 109 - 114 * |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11830188B2 (en) | 2018-05-10 | 2023-11-28 | Sysmex Corporation | Image analysis method, apparatus, non-transitory computer readable medium, and deep learning algorithm generation method |
US12272061B2 (en) | 2018-05-10 | 2025-04-08 | Juntendo Educational Foundation | Image analysis method, apparatus, non-transitory computer readable medium, and deep learning algorithm generation method |
CN111861975A (zh) * | 2019-04-26 | 2020-10-30 | 学校法人顺天堂 | 辅助疾病分析的方法、装置及计算机程序、以及训练计算机算法的方法、装置及程序 |
JP2020180954A (ja) * | 2019-04-26 | 2020-11-05 | 学校法人順天堂 | 疾患解析を支援する方法、装置、及びコンピュータプログラム、並びにコンピュータアルゴリズムを訓練する方法、装置、及びプログラム |
JP7381003B2 (ja) | 2019-04-26 | 2023-11-15 | 学校法人順天堂 | 疾患解析を支援する方法、装置、及びコンピュータプログラム、並びにコンピュータアルゴリズムを訓練する方法、装置、及びプログラム |
US11978198B2 (en) | 2019-04-26 | 2024-05-07 | Juntendo Educational Foundation | Method, apparatus, and computer program for supporting disease analysis, and method, apparatus, and program for training computer algorithm |
JP7668502B2 (ja) | 2019-04-26 | 2025-04-25 | 学校法人順天堂 | 疾患解析を支援する方法、装置、及びコンピュータプログラム、並びにコンピュータアルゴリズムを訓練する方法、装置、及びプログラム |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111986150B (zh) | 一种数字病理图像的交互式标注精细化方法 | |
Veta et al. | Assessment of algorithms for mitosis detection in breast cancer histopathology images | |
Ghasemian et al. | An efficient method for automatic morphological abnormality detection from human sperm images | |
US8199997B2 (en) | Feature dependent extended depth of focusing on semi-transparent biological specimens | |
TWI379248B (en) | Methods and systems for processing biological specimens utilizing multiple wavelengths | |
Sertel et al. | Computer-aided prognosis of neuroblastoma: Detection of mitosis and karyorrhexis cells in digitized histological images | |
CN112215800B (zh) | 基于机器学习的重叠染色体识别和分割方法 | |
CN113658174B (zh) | 基于深度学习和图像处理算法的微核组学图像检测方法 | |
US11455724B1 (en) | Systems and methods to process electronic images to adjust attributes of the electronic images | |
US12131465B2 (en) | User-assisted iteration of cell image segmentation | |
CN113130049A (zh) | 基于云服务的智能病理图像诊断系统 | |
JP4864709B2 (ja) | 分散プロット分布を用いてスライドの染色品質を決定するシステム | |
Sivakamasundari et al. | Proposal of a Content Based retinal Image Retrieval system using Kirsch template based edge detection | |
WO2019039595A1 (fr) | Dispositif et procédé d'estimation de catégorie de cytodiagnostic du col de l'utérus | |
CN112950585A (zh) | 基于液基薄层细胞检测技术tct的宫颈癌细胞智能检测方法 | |
US8538122B2 (en) | Localization of a valid area of a blood smear | |
JP4897488B2 (ja) | 分散プロット分布を用いてスライドを分類するシステム | |
Shirazi et al. | Automated pathology image analysis | |
Arya et al. | Clustering techniques on pap-smear images for the detection of cervical cancer | |
US20100111398A1 (en) | Method and system for detection of oral sub-mucous fibrosis using microscopic image analysis of oral biopsy samples | |
CN111882521A (zh) | 一种细胞涂片的图像处理方法 | |
CN114511559B (zh) | 染色鼻息肉病理切片质量多维评价方法、系统及介质 | |
CN111062909A (zh) | 乳腺肿块良恶性判断方法及设备 | |
CA3216960A1 (fr) | Systemes et procedes de traitement d'images electroniques pour ajuster des colorations dans les images electroniques | |
Forsberg et al. | Evaluating cell nuclei segmentation for use on whole-slide images in lung cytology |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18848180 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 18848180 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: JP |