WO2018017097A1 - Computerized methods for cell-based pattern recognition - Google Patents
Computerized methods for cell-based pattern recognition
- Publication number
- WO2018017097A1 (PCT/US2016/043318)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- images
- classifier
- cell
- interactive
- algorithm
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 52
- 238000003909 pattern recognition Methods 0.000 title claims abstract description 32
- 230000002452 interceptive effect Effects 0.000 claims abstract description 27
- 238000004422 calculation algorithm Methods 0.000 claims description 23
- 238000000605 extraction Methods 0.000 claims description 16
- 238000004458 analytical method Methods 0.000 claims description 13
- 238000007635 classification algorithm Methods 0.000 claims description 9
- 238000012512 characterization method Methods 0.000 claims description 7
- 238000010186 staining Methods 0.000 claims description 7
- 238000013528 artificial neural network Methods 0.000 claims description 2
- 238000003066 decision tree Methods 0.000 claims description 2
- 230000014509 gene expression Effects 0.000 claims description 2
- 238000012706 support-vector machine Methods 0.000 claims description 2
- 238000001000 micrograph Methods 0.000 abstract description 6
- 210000004027 cell Anatomy 0.000 description 81
- 210000001519 tissue Anatomy 0.000 description 26
- 210000004881 tumor cell Anatomy 0.000 description 11
- WZUVPPKBWHMQCE-UHFFFAOYSA-N Haematoxylin Chemical compound C12=CC(O)=C(O)C=C2CC2(O)C1C1=CC=C(O)C(O)=C1OC2 WZUVPPKBWHMQCE-UHFFFAOYSA-N 0.000 description 6
- 210000004882 non-tumor cell Anatomy 0.000 description 6
- 210000000481 breast Anatomy 0.000 description 5
- 206010028980 Neoplasm Diseases 0.000 description 4
- 238000013459 approach Methods 0.000 description 4
- 238000004040 coloring Methods 0.000 description 4
- 102000003998 progesterone receptors Human genes 0.000 description 4
- 108090000468 progesterone receptors Proteins 0.000 description 4
- 238000001514 detection method Methods 0.000 description 3
- 230000010354 integration Effects 0.000 description 3
- 238000004364 calculation method Methods 0.000 description 2
- 230000003993 interaction Effects 0.000 description 2
- 238000010801 machine learning Methods 0.000 description 2
- 238000005259 measurement Methods 0.000 description 2
- 238000010422 painting Methods 0.000 description 2
- 238000011002 quantification Methods 0.000 description 2
- 206010006187 Breast cancer Diseases 0.000 description 1
- 208000026310 Breast neoplasm Diseases 0.000 description 1
- 208000009458 Carcinoma in Situ Diseases 0.000 description 1
- 239000003086 colorant Substances 0.000 description 1
- 210000000805 cytoplasm Anatomy 0.000 description 1
- 238000002059 diagnostic imaging Methods 0.000 description 1
- 238000009826 distribution Methods 0.000 description 1
- YQGOJNYOYNNSMM-UHFFFAOYSA-N eosin Chemical compound [Na+].OC(=O)C1=CC=CC=C1C1=C2C=C(Br)C(=O)C(Br)=C2OC2=C(Br)C(O)=C(Br)C=C21 YQGOJNYOYNNSMM-UHFFFAOYSA-N 0.000 description 1
- 230000002496 gastric effect Effects 0.000 description 1
- 238000010191 image analysis Methods 0.000 description 1
- 201000004933 in situ carcinoma Diseases 0.000 description 1
- 235000019239 indanthrene blue RS Nutrition 0.000 description 1
- UHOKSCJSTAHBSO-UHFFFAOYSA-N indanthrone blue Chemical compound C1=CC=C2C(=O)C3=CC=C4NC5=C6C(=O)C7=CC=CC=C7C(=O)C6=CC=C5NC4=C3C(=O)C2=C1 UHOKSCJSTAHBSO-UHFFFAOYSA-N 0.000 description 1
- 239000012528 membrane Substances 0.000 description 1
- 238000010606 normalization Methods 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 238000012545 processing Methods 0.000 description 1
- 238000000926 separation method Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/583—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- G06N20/10—Machine learning using kernel methods, e.g. support vector machines [SVM]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
- G06V20/695—Preprocessing, e.g. image segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
- G06V20/698—Matching; Classification
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10056—Microscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
- G06T2207/20104—Interactive definition of region of interest [ROI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30024—Cell structures in vitro; Tissue sections in vitro
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30096—Tumor; Lesion
Definitions
- The claimed invention relates generally to systems and methods for computerized medical imaging and analysis; and more particularly, to systems and methods for cell-based pattern recognition and machine learning as applied to microscopy images from tissue sections.
- A pathologist can outline regions-of-analysis that include only cells of interest, but this is very time consuming and impractical when analyzing entire tissue sections.
- An automated pattern recognition tool is therefore needed that identifies the cells of interest in tissue.
- Pattern recognition tools that use general-purpose pixel-based feature sets can be used in a wide variety of applications. However, in many cases they provide only sub-optimal performance for any particular application.
- A method for cell-based pattern recognition is incorporated into a computerized platform. The method includes: using a computer coupled to a database containing a plurality of images of biological tissue sections, calling up one or more first images of said plurality of digital images for analysis; for said first images: executing a feature extraction algorithm, said feature extraction algorithm configured to detect cells within said first images and analyze one or more cell features thereof; and performing an interactive classifier learning algorithm, said interactive classifier learning algorithm configured to create an application-specific classifier based on interactive user annotations of said cell features of the first images; and for one or more second images of said plurality of digital images: executing the feature extraction algorithm to detect cells within the second images and analyze one or more cell features thereof; and executing an automated classification algorithm, said automated classification algorithm being configured to characterize the cells and cell features of the second images using the application-specific classifier.
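The two-phase flow of this claim (interactive classifier learning on first images, automated classification of second images) can be sketched in Python. This is an illustrative toy only: all names, the image representation, and the deliberately trivial single-feature classifier are hypothetical stand-ins, not the patented implementation.

```python
# Sketch of the claimed two-phase workflow (hypothetical names and data).

def extract_cell_features(image):
    """Placeholder feature extraction: one feature dict per detected cell.
    A real detector would segment nuclei; here the cells are given directly."""
    return [{"area": c["area"], "density": c["density"]} for c in image["cells"]]

def learn_classifier(cells, labels):
    """Learn a single-feature threshold from user-annotated examples (a toy
    stand-in for the interactive classifier-learning program)."""
    tumor = [c["density"] for c, lab in zip(cells, labels) if lab == "tumor"]
    non = [c["density"] for c, lab in zip(cells, labels) if lab == "non-tumor"]
    threshold = (min(tumor) + max(non)) / 2.0
    return lambda cell: "tumor" if cell["density"] >= threshold else "non-tumor"

# Phase 1: first image, with interactive user annotations.
first = {"cells": [{"area": 40, "density": 0.60}, {"area": 25, "density": 0.10}]}
classify = learn_classifier(extract_cell_features(first), ["tumor", "non-tumor"])

# Phase 2: second image, automated classification with the learned classifier.
second = {"cells": [{"area": 38, "density": 0.55}, {"area": 22, "density": 0.15}]}
results = [classify(c) for c in extract_cell_features(second)]
print(results)  # → ['tumor', 'non-tumor']
```

Note that the same feature extraction runs in both phases; only the classifier differs between learning and application.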
- FIG. 1 illustrates a method for cell-based pattern recognition.
- FIG. 2 shows the interactive classifier-learning process according to the method illustrated in FIG. 1.
- FIG. 3 shows the automated classification process according to the method illustrated in FIG. 1.
- FIG. 4 is an image showing cells detected by the application-specific feature extraction program in accordance with one example: the identification of tumor cells in breast tissue using progesterone receptor staining.
- FIG. 5 is an image showing cells of different cell types during the interactive classifier-learning process in accordance with the referenced example.
- FIG. 6 is an image showing cells classified by the automated classification process in the referenced example.
- A key to building a high-performance pattern recognition tool for microscopy images of tissue sections is to customize the feature extraction to each particular application and to classify at the cell level based on cell-specific features.
- The computerized pattern recognition tool is based on a feature extraction process, an interactive classifier-learning process, and an automated classification process. Collectively, these define a method for cell-based pattern recognition, which is an improvement over conventional pattern recognition tools.
- A device, such as a computer, programmed to acquire microscopy images and process them in accordance with the method for cell-based pattern recognition described herein can be referred to as a system configured for cell-based pattern recognition.
- The feature extraction process includes the detection of cells and the calculation of the cell features that will subsequently be used for the classification of the cells.
- The detection of cells needs to be application-specific with respect to: the tissue type (for example, round cells in breast tissue vs. elongated cells in gastrointestinal tissue); the cell compartments being stained (for example, nucleus, membrane, and cytoplasm); and the staining chromogen (for example, Hematoxylin, Eosin, or DAB).
- The calculated cell features include:
- characterization of the cell morphology (e.g., area of the nucleus);
- characterization of the staining (e.g., mean optical density of DAB staining on the nucleus); and
- characterization of the cell neighborhood (e.g., nuclei profile surface density: the percentage of the area in the neighborhood of a cell that is covered by nuclei).
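The three feature families just listed can be illustrated on a tiny labeled nucleus mask. The arrays, the function name, and the field-wide coverage used as a crude stand-in for the nuclei profile surface density are all assumptions for illustration, not the patent's actual algorithms.

```python
import numpy as np

# Toy labeled field: 0 = background, 1 and 2 are nucleus ids.
labels = np.array([
    [1, 1, 0, 0],
    [1, 1, 0, 2],
    [0, 0, 0, 2],
])
# Per-pixel DAB optical density (made-up values).
dab = np.array([
    [0.8, 0.7, 0.0, 0.0],
    [0.9, 0.8, 0.0, 0.2],
    [0.0, 0.0, 0.0, 0.3],
])

def cell_features(labels, dab, nucleus_id):
    mask = labels == nucleus_id
    area = int(mask.sum())                # morphology: nucleus area in pixels
    mean_od = float(dab[mask].mean())     # staining: mean DAB optical density
    # Neighborhood: fraction of the field covered by any nucleus, a crude
    # stand-in for the nuclei profile surface density.
    density = float((labels > 0).mean())
    return {"area": area, "mean_od": mean_od, "density": density}

f1 = cell_features(labels, dab, 1)
print(f1)  # area 4, mean DAB OD 0.8, neighborhood density 0.5
```

A real implementation would compute the density over a neighborhood window around each cell rather than the whole field.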
- The classifier-learning process is an interactive program that creates a classifier from examples provided by a user.
- The classifier uses the cell features and provides a classification at the cell level.
- The user defines the number of different cell types of interest and then identifies examples of cells that are representative of those cell types.
- The program trains a classifier on those examples using supervised machine learning techniques and displays the cell classification results based on the current classifier.
- Pattern recognition at the cell level with pre-calculated cell features is very fast, so this process, in which the user provides examples, the classifier is updated, and the updated classification results are displayed, can be very responsive.
- The pattern recognition tool can use any classification algorithm that supports supervised learning.
- Standard classification algorithms, their derivatives, or combinations thereof can be used, including but not limited to: Bayes classifiers, k-nearest neighbor, maximum entropy classifiers, Markov models, support vector machines, gene expression programming, neural networks, and decision trees.
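As a concrete instance of one listed algorithm, a minimal 1-nearest-neighbor classifier over pre-computed cell features might look as follows; the feature values (nucleus area, staining density) and labels are invented for illustration.

```python
# Annotated training examples: ((nucleus_area, stain_density), label).
# Values are illustrative, not measured data.
train = [((30.0, 0.70), "tumor"),
         ((28.0, 0.65), "tumor"),
         ((12.0, 0.10), "non-tumor"),
         ((15.0, 0.20), "non-tumor")]

def classify_1nn(features):
    """Return the label of the nearest training example (1-NN)."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(train, key=lambda ex: sq_dist(ex[0], features))[1]

print(classify_1nn((29.0, 0.60)))  # → tumor
print(classify_1nn((13.0, 0.15)))  # → non-tumor
```

Because classification operates on a handful of pre-calculated features per cell rather than on raw pixels, even such a simple learner can be retrained and re-applied interactively.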
- FIG. 1 shows a method for cell-based pattern recognition in accordance with an embodiment.
- The method includes an application-specific feature extraction, wherein cell-based features are detected.
- An interactive learning classifier is developed as a practitioner annotates a particular specimen image to identify and classify various cell-based features as pertinent or non-pertinent.
- The system can then run an automated classification based on the specific application.
- The interactive learning classifier can be saved to memory and stored for future use. Additionally, the classifier can be called up for further refinement and tuning by a practitioner, for example to train the system for an application using a unique set of tissue specimen images.
- FIG. 2 shows the classifier-learning process from the loading of images to the creation of an application-specific classifier.
- The first step is the application-specific feature extraction, which detects the cells and calculates their features; this can be an automated cell-feature algorithm performed for one or more section images of a tissue sample volume.
- The second step is the classifier learning, an interactive process in which a user provides examples of the different cell types and the program creates an application-specific classifier based on those examples.
- The classification program classifies the cells by applying the classifier to the cell features. Note that the feature extraction program and the classifiers created by the classifier-learning program are application-specific; the compatibility of the cells provided by the feature extraction program with the classifier is verified by the classification program.
- FIG. 3 shows the classification process from the loading of images into the system to the classified cells.
- The first step is the same as for the classifier learning: the application-specific feature extraction that detects the cells and calculates the cell features.
- The second step is the classification of the cells on their cell features, using the application-specific classifier created by the classifier-learning program.
- A critical problem for a pattern recognition tool for microscopy images of tissue sections is that cells can look considerably different in tissue samples from different origins (e.g., breast tumor nuclei sizes differ between patients). Therefore a calibration step is part of both the classifier learning and the classification.
- The calibration method and parameters can be: hard-coded (e.g., the program measures the mean diameter of all nuclei in the entire tissue section and uses it to normalize all nuclei size measurements used for the classification); assisted by a user (e.g., the user outlines tumor nuclei in the tissue section, and the program measures the mean diameter of those tumor nuclei and uses it to normalize all nuclei size measurements used for the classification); and/or automatically determined by the classifier learning when using tissue sections from different origins and then automatically applied by the classification (e.g., the classifier learning determines that the means of the tumor nuclei size distributions vary between tissue sections from different origins, and a mean-value normalization is applied).
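The hard-coded calibration variant, normalizing size measurements by the mean nucleus diameter of the whole section, can be sketched as follows; the diameter values are made up for illustration.

```python
# Normalize each nucleus diameter by the section-wide mean, so that size
# features become comparable between tissue sections from different origins.
def calibrate_sizes(diameters):
    mean_d = sum(diameters) / len(diameters)
    return [d / mean_d for d in diameters]

# Two hypothetical sections whose nuclei differ in absolute size but share
# the same relative spread:
section_a = [8.0, 10.0, 12.0]
section_b = [12.0, 15.0, 18.0]
norm_a = calibrate_sizes(section_a)
norm_b = calibrate_sizes(section_b)
print(norm_a)  # → [0.8, 1.0, 1.2]
print(norm_b)  # → [0.8, 1.0, 1.2]
```

After calibration, a size threshold learned on one section can be reused on the other, which is the point of the calibration step.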
- One embodiment can include applying the systems and methods to build responsive image analysis tools as described in commonly owned and co-pending U.S. Serial No. 14/052,773, filed October 14, 2013, the contents of which are hereby incorporated by reference.
- The feature extraction can be part of a low-level image analysis program that is executed automatically by the system.
- The classifier learning would be part of an interactive high-level image analysis program operated by a user.
- The separation of the processing-heavy feature extraction from the classifier learning, together with classification at the cell level, provides the basis for the implementation of a highly interactive and responsive classifier-learning program.
- The classification does not require any user interaction and could therefore be part of the low-level image analysis program. However, given its dependency on the classifier, it is better implemented as part of a high-level image analysis program that is executed either automatically by the system or by a user, depending on the interactions required.
- The cell-based pattern recognition approach can be integrated with pixel-based and/or region-based pattern recognition approaches.
- Integration with pixel-based approaches is desirable when the analysis needs to include regions that are not part of the detected cells.
- Integration with region-based approaches is desirable when region-level features are important for the classification of the cells (e.g., invasive tumor vs. carcinoma in situ).
- A region representation of the cells provides a convenient data structure for this integration.
- The systems and methods are described using a simple application: the identification of tumor cells in breast tissue when using progesterone receptor staining.
- The slides are stained with DAB (brown) for the quantification of the progesterone receptors in the nuclei and with Hematoxylin (blue) to identify the nuclei. Note that in this application the nucleus is the only cell compartment that is stained.
- The application-specific feature extraction program is optimized to detect the nuclei based on the Hematoxylin and DAB staining and the specific morphology of the nuclei in breast tissue.
- The cell detection was already part of the tissue analysis application that provides the quantification of progesterone receptors in breast tissue; for the classification of the cells, only the calculation of additional cell features needed to be implemented.
- FIG. 4 shows the nuclei detected by the feature extraction program.
- The classifier-learning program was set up for two different cell types: tumor cells and non-tumor cells.
- A user identifies representative examples of the nuclei belonging to tumor cells and non-tumor cells.
- The program provides an updated display of the classification results as more examples are provided or existing examples are updated.
- A very simple gating classification algorithm was used for this illustration: the algorithm identifies the significant features that distinguish the different cell types by minimum and maximum thresholds, and determines those thresholds.
- This interactive program is very responsive because the learning and classification are done at the cell level (vs. the pixel level) and the cell features are already pre-calculated.
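A toy version of such a gating classifier, learning per-feature [min, max] gates from user-labeled tumor examples, might look like the following. The single feature shown is a hypothetical stand-in for the nuclei profile surface density (in percent); the values are invented.

```python
# Learn per-feature [min, max] gates from user-identified tumour examples,
# then accept a cell as 'tumor' only if every feature lies inside its gate.
def learn_gates(examples):
    """examples: list of feature dicts for user-identified tumour cells."""
    keys = examples[0].keys()
    return {k: (min(e[k] for e in examples), max(e[k] for e in examples))
            for k in keys}

def classify(cell, gates):
    inside = all(lo <= cell[k] <= hi for k, (lo, hi) in gates.items())
    return "tumor" if inside else "non-tumor"

# Two user-painted tumour nuclei define the gate on the single feature.
gates = learn_gates([{"density_pct": 30.0}, {"density_pct": 55.0}])
print(classify({"density_pct": 40.0}, gates))  # → tumor
print(classify({"density_pct": 20.0}, gates))  # → non-tumor
```

A production version would also discard non-discriminative features and pad the gates with a margin, but the interactive update loop is the same: repaint examples, re-learn gates, redisplay.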
- FIG. 5 shows the nuclei of the different cell types of interest during classifier-learning.
- The large light blue circle shows the current position of the painting tool that allows the user to identify the nuclei of tumor cells.
- The dark blue coloring of a nucleus shows that the nucleus has been identified by the user as a nucleus of a tumor cell.
- The medium blue coloring of a nucleus shows that it has been classified by the current classifier as a nucleus of a tumor cell.
- The large light green circle shows the current position of the painting tool that allows the user to identify the nuclei of non-tumor cells.
- The dark green coloring of a nucleus shows that the nucleus has been identified by the user as a nucleus of a non-tumor cell.
- The medium green coloring of a nucleus shows that it has been classified by the current classifier as a nucleus of a non-tumor cell.
- The classification program has been configured to use only the tumor cells for the tissue analysis.
- The classifier produced in this example actually used only a single cell feature, the nuclei profile surface density, and determined a threshold of 25% to distinguish tumor cells from non-tumor cells. Equivalent results using general-purpose pixel-based pattern recognition tools would have required more features and more complex classification algorithms.
- FIG. 6 shows, in blue, the nuclei that were classified as nuclei of tumor cells.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- Medical Informatics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Computing Systems (AREA)
- Mathematical Physics (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- Radiology & Medical Imaging (AREA)
- Library & Information Science (AREA)
- Multimedia (AREA)
- Life Sciences & Earth Sciences (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biomedical Technology (AREA)
- Molecular Biology (AREA)
- Quality & Reliability (AREA)
- Databases & Information Systems (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Public Health (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
Disclosed are systems and methods for a cell-based pattern recognition tool for microscopy images from tissue sections, wherein cell features are extracted and a classifier is built for a particular application through interactive training on a computerized platform. The result is an application-specific classifier that further processes images in accordance with the specific application, thereby enabling an automated, cell-based pattern recognition process.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2016/043318 WO2018017097A1 (fr) | 2016-07-21 | 2016-07-21 | Computerized methods for cell-based pattern recognition |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2016/043318 WO2018017097A1 (fr) | 2016-07-21 | 2016-07-21 | Computerized methods for cell-based pattern recognition |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018017097A1 true WO2018017097A1 (fr) | 2018-01-25 |
Family
ID=60992332
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2016/043318 WO2018017097A1 (fr) | 2016-07-21 | 2016-07-21 | Procédés informatisés de reconnaissance de formes cellulaires |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2018017097A1 (fr) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080292194A1 (en) * | 2005-04-27 | 2008-11-27 | Mark Schmidt | Method and System for Automatic Detection and Segmentation of Tumors and Associated Edema (Swelling) in Magnetic Resonance (Mri) Images |
US20150110368A1 (en) * | 2013-10-22 | 2015-04-23 | Eyenuk, Inc. | Systems and methods for processing retinal images for screening of diseases or abnormalities |
US20150317509A1 (en) * | 2002-09-13 | 2015-11-05 | Life Technologies Corporation | Interactive and Automated Tissue Image Analysis with Global Training Database and Variable-Abstraction Processing in Cytological Specimen Classification and Laser Capture Microdissection Applications |
-
2016
- 2016-07-21 WO PCT/US2016/043318 patent/WO2018017097A1/fr active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150317509A1 (en) * | 2002-09-13 | 2015-11-05 | Life Technologies Corporation | Interactive and Automated Tissue Image Analysis with Global Training Database and Variable-Abstraction Processing in Cytological Specimen Classification and Laser Capture Microdissection Applications |
US20080292194A1 (en) * | 2005-04-27 | 2008-11-27 | Mark Schmidt | Method and System for Automatic Detection and Segmentation of Tumors and Associated Edema (Swelling) in Magnetic Resonance (Mri) Images |
US20150110368A1 (en) * | 2013-10-22 | 2015-04-23 | Eyenuk, Inc. | Systems and methods for processing retinal images for screening of diseases or abnormalities |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- CN111448582B (zh) | Systems and methods for single-channel whole-cell segmentation | |
- WO2015069824A2 (fr) | Diagnostic system and method for the analysis of biological tissue | |
US9424459B1 (en) | Computerized methods for cell-based pattern recognition | |
MoradiAmin et al. | Enhanced recognition of acute lymphoblastic leukemia cells in microscopic images based on feature reduction using principle component analysis | |
Bhattacharjee et al. | Detection of Acute Lymphoblastic Leukemia using watershed transformation technique | |
Tareef et al. | Automated three-stage nucleus and cytoplasm segmentation of overlapping cells | |
Kong et al. | Image analysis for automated assessment of grade of neuroblastic differentiation | |
McKenna et al. | Immunohistochemical analysis of breast tissue microarray images using contextual classifiers | |
Bhamare et al. | Automatic blood cell analysis by using digital image processing: a preliminary study | |
Paeng et al. | A unified framework for tumor proliferation score prediction in breast histopathology | |
Chang et al. | Multireference level set for the characterization of nuclear morphology in glioblastoma multiforme | |
He et al. | Multiphase level set model with local K-means energy for histology image segmentation | |
- EP4379676A1 (fr) | Detection system, detection apparatus, learning apparatus, detection method, learning method, and program | |
Boucheron et al. | Use of imperfectly segmented nuclei in the classification of histopathology images of breast cancer | |
- WO2021188477A1 (fr) | Scalable, high-precision, context-guided segmentation of histological structures including ducts/glands and lumen, duct/gland clusters, and individual nuclei in whole-slide images of tissue samples from spatial multi-parameter cellular and sub-cellular imaging platforms | |
Chelebian et al. | Self-Supervised Learning for Genetically Relevant Domain Identification in Morphological Images | |
- WO2018017097A1 (fr) | Computerized methods for cell-based pattern recognition | |
Kost et al. | Training nuclei detection algorithms with simple annotations | |
Avenel et al. | Marked point processes with simple and complex shape objects for cell nuclei extraction from breast cancer H&E images | |
Sailem et al. | Discovery of rare phenotypes in cellular images using weakly supervised deep learning | |
Guerrero et al. | Improvements in lymphocytes detection using deep learning with a preprocessing stage | |
US12039720B2 (en) | Automatic estimation of tumor cellularity using a DPI AI platform | |
Sertel et al. | An image analysis approach for detecting malignant cells in digitized H&E-stained histology images of follicular lymphoma | |
Jin et al. | A random-forest random field approach for cellular image segmentation | |
- CN114332037A (zh) | Method and apparatus for automatically segmenting multiple tissue classes in pancreatic cancer pathology slides | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16909689 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16909689 Country of ref document: EP Kind code of ref document: A1 |