WO2003030073A1 - Quality measure - Google Patents
Quality measure
- Publication number
- WO2003030073A1 · PCT/DK2002/000660
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- measure
- image
- quality
- fundus
- fundus image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0016—Operational features thereof
- A61B3/0025—Operational features thereof characterised by electronic signal processing, e.g. eye models
Definitions
- the present invention relates to a method for determining the quality of a fundus image, the use of said quality for determining pathologies and/or artefacts in the image, and methods for handling said image, as well as a system comprising algorithms performing the methods.
- Fundus image analysis presents several challenges, such as high image variability, the need for reliable processing under non-ideal imaging conditions, and short computation deadlines. Large variability is observed between different patients, even healthy ones, and the situation worsens when pathologies exist. For the same patient, variability is observed under differing imaging conditions and during the course of a treatment, or simply over a long period of time. Moreover, fundus images are often of limited quality, being subject to improper illumination, glare, fadeout, loss of focus and artifacts arising from reflection, refraction and dispersion.
- Diabetes is the leading cause of blindness in working-age adults. It is a disease that, among its many symptoms, includes a progressive impairment of the peripheral vascular system. These changes in the vasculature of the retina cause progressive vision impairment and eventually complete loss of sight. The tragedy of diabetic retinopathy is that in the vast majority of cases, blindness is preventable by early diagnosis and treatment, but screening programs that could provide early detection are not widespread.
- Image quality is an important parameter in automated fundus image analysis systems, as algorithms for estimating vessel geometry, detecting the optic nerve head and detecting lesions are often developed and validated using images of a reasonable quality. When images of poor quality are passed to the system, the algorithms may return erroneous results, which may have critical consequences, especially in an automatic screening scenario.
- the present inventors have been able to calculate a quality measure that, apart from being used in selecting images for automatic screening, also may provide an indicator for various pathologies and artefacts of the image, since it has been found that there is a systematic correlation between the image quality and some pathologies in or about the eye.
- the present invention relates to an automatic method for quantifying and/or qualifying pathology indicators and/or artefacts of a fundus image or of a part of a fundus image, comprising
- the present invention relates to a quality measure as such, the use of said quality measure for example being in selecting images for automatic screening procedures. Accordingly, the present invention further relates to a method for quantifying the quality of a fundus image, comprising
- the quality measure may be used as a tool in a method for detecting structure and pathologies in a fundus image, such as a method for automatically detecting the presence or absence of a structure and/or a pathological condition in parts of a fundus image, comprising
- the gradability measure relates to an overall measure of events in the acquisition route of the image as well as events in the image as such, wherein events leading to a low gradability measure may be hemorrhages in the retina and/or large scars, the events being of a size that interferes with the detection of the structures and other pathologies.
- the quality measure may be used as a criterion when automatically registering the images, in that poor quality images should be rejected and replaced by better quality images if possible.
- a method for registering at least two fundus images from the same eye comprising
- an important aspect of the present invention is a method for selecting fundus images for automatic screening, because fundus images of a poor quality may lead to false positive or false negative detections, as the representation of the poor quality in the images disturbs the algorithms.
- the present invention further relates to a method for selecting fundus images for automatic screening, comprising
- the invention further relates to a system comprising the algorithms capable of performing the methods according to the invention.
- Figures 1-4 Four fundus images and their gradient contrast measures.
- the CV measure is the coefficient of variation of the gradient magnitudes in the image, and is a measure of the overall quality of the image.
- the robust CV measure is the CV of the gradients where outliers are removed.
- Figure 5 The robust CV of gradient magnitudes as a function of the gradient estimation method for the four fundus images displayed in Figures 1-4.
- Fovea The term is used in its normal anatomical meaning, i.e. the spot in the retina with a high concentration of cones, responsible for sharp central vision. Fovea and the term "macula lutea" are used as synonyms.
- Image The term image is used to describe a representation of the region to be examined, i.e. the term image includes 1-dimensional representations, 2-dimensional representations, 3-dimensional representations as well as n-dimensional representations. Thus, the term image includes a volume of the region, a matrix of the region as well as an array of information of the region.
- Optic nerve head The term is used in its normal anatomical meaning, i.e. the area in the fundus of the eye where the optic nerve enters the retina. Synonyms for the area are for example, the "blind" spot, the papilla, or the optic disc.
- Red-green-blue image The term relates to the image having the red channel, the green channel and the blue channel, also called the RGB image.
- ROI Region of interest.
- Visibility The term visibility is used in the normal meaning of the word, i.e. how visible a lesion or a structure of the fundus region is compared to background and other structures/lesions.
- the images of the present invention may be any sort of images and presentations of the region of interest.
- Fundus image is a conventional tool for examining retina and may be recorded on any suitable means.
- the image is presented on a medium selected from dias, paper photos or digital photos.
- the image may be any other kind of representation, such as a presentation on an array of photo receptor elements, for example a CCD.
- the image may be a grey-toned image or a colour image; in a preferred embodiment the image is a colour image.
- the quality measure according to the present invention is an acquisition quality measure, i.e. a measure of the quality of the image related to the optical and electronic parts of the acquisition.
- the acquisition quality relates to the optical path, that is the route from in front of the fundus, such as from the vitreous body, to the optical means of the acquisition apparatus, be it a camera, a CCD or the like.
- the optic system may be any part of the optic path, from the vitreous body, lens and cornea to the camera or recorder.
- the electronic path relates to the route of the image in the camera, CCD or the like and into the computer capable of automatically measuring the quality.
- the quality measure is often a global measure, in the meaning that the quality of the image as a whole is given. It is however also possible to detect the image quality locally, for example for parts of the image. In particular for local quality events, such as locally presented artefacts, this may be an advantage, since it may lead to rejection of only a part of the image during automatic screening and detection on the rest of the image. For example a local quality measure may be assigned to more than one part of the image, and the image may then have different quality measures for different parts of the image.
- the quality measure may be calculated for parts of the image and give rise to a global measure, such as wherein at least one quality measure is calculated locally for more than one part of the fundus image, and then optionally summed up to a global measure.
- the acquisition quality measure may be calculated by any suitable measures, whereamong the following are preferred for the invention: a contrast measure, a sharpness measure, an interlacing measure, a signal-to-noise ratio, a colour composition measure and an illumination measure.
- important measures are those capable of detecting too little or too high variation in the image, wherein little variation is often due to unsharp images, and high variation is often due to events in the image relating to artefacts.
- a preferred quality measure is a contrast measure, wherein the contrast measure may be the variation in the gradient magnitude image, preferably the coefficient-of-variation of the gradient magnitude.
- the contrast measure is preferred since it is sensitive to the variation of the image.
- the contrast measure is preferably calculated robustly, preferably by iteratively discarding outliers. Outliers may be observations deviating more than a number of standard deviations from the mean on a log-scale, wherein said number preferably is in the range of from 1 to 10.
- the contrast measure is a gradient contrast measure. This is a measure of the overall quality of the image, loosely speaking in terms of visibility of details in the image. Poor gradient contrast may be related either to a technical problem, such as wrong illumination of the retina, or to pathology, such as a cataract.
- the gradient contrast measure correlates well with the human grader's interpretation of the quality of a fundus image, as will be illustrated later.
- R and C are the number of rows and columns in the image.
- the heuristic idea is that visually, as well as in the automatic lesion detection algorithm, visibility of local features in the image is related to gradient magnitude. Large variation in the gradient magnitude, and hence a large CV, indicates that there is a large difference between sections with small gradients ("background") and sections with large gradients (sections with features such as vessels or the optic nerve head).
- the CV may be calculated based on the original image, I, or on a function of the image I, such as a filtered image.
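The per-image CV can be sketched minimally in numpy. This is an illustrative sketch only: the gradient estimator here is plain central differences (`np.gradient`), an assumption standing in for the Gaussian and polynomial kernels discussed later, and the function name `gradient_cv` is hypothetical.

```python
import numpy as np

def gradient_cv(image):
    """CV of gradient magnitudes: std/mean over the image.

    Sketch: gradients via central differences rather than the
    Gaussian/polynomial kernels of the patent; exactly-flat pixels
    (zero magnitude) are ignored."""
    gy, gx = np.gradient(image.astype(float))
    mag = np.hypot(gx, gy)
    mag = mag[mag > 0]
    return float(mag.std() / mag.mean())
```

A low value would flag a low-contrast image, an unusually high value an atypical one, in the spirit of the thresholds discussed below.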
- a filtered image I may be used, said filtered image I being produced by
- W(r,c) is a window around pixel (r,c) .
- An unsharp masked image may be produced as
- the unsharp image does not contain general background variation.
- the gradient contrast measure may then be calculated from the unsharp image.
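A possible implementation of such an unsharp-masked image, assuming the local window mean is a simple box filter (the patent's exact window W(r,c) is not reproduced; `box_mean` and `unsharp` are hypothetical names):

```python
import numpy as np

def box_mean(x, k):
    """k x k box (mean) filter via 2-D cumulative sums, edge-padded."""
    pad = k // 2
    xp = np.pad(x.astype(float), pad, mode="edge")
    cs = np.pad(xp.cumsum(0).cumsum(1), ((1, 0), (1, 0)))
    R, C = x.shape
    return (cs[k:k + R, k:k + C] - cs[:R, k:k + C]
            - cs[k:k + R, :C] + cs[:R, :C]) / (k * k)

def unsharp(image, k=51):
    """Unsharp masking: subtract the local mean, removing the general
    background variation while keeping local detail."""
    return image - box_mean(image, k)
```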
- a robust CV, where outliers are iteratively removed, is preferably used.
- the outliers are defined as observations deviating more than 4 standard deviations from the mean on a log-scale. This criterion is implemented on the original scale by assuming a lognormal distribution of the gradient magnitudes; thus the magnitudes are not actually log-transformed in the algorithm.
- An iteration in the outlier-removal procedure is thus defined as follows: let n denote the number of current observations not regarded as outliers, and let S and SS denote respectively their sum and squared sum. The minimal current observation x_min is considered an outlier if it lies more than λ standard deviations below the mean on the log scale (evaluated on the original scale via the lognormal moment estimates derived from n, S and SS), where λ is the outlier-tolerance, preferably set to 4.0. If x_min is an outlier, n, S and SS are updated and the next minimal observation is considered. If not, the maximal current observation x_max is considered, and is declared an outlier by the symmetric criterion.
- Let CV_rob denote the robust CV-measure.
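The iterative trimming can be sketched as follows. For readability this sketch recomputes the log-scale mean and standard deviation directly instead of maintaining the running sums n, S and SS, and the function name `robust_cv` is hypothetical:

```python
import numpy as np

def robust_cv(magnitudes, lam=4.0):
    """Robust CV: iteratively discard the extreme observation if it
    lies more than `lam` standard deviations from the mean on a log
    scale, then return std/mean of the surviving observations."""
    x = np.sort(np.asarray(magnitudes, dtype=float))
    x = x[x > 0]                       # the log scale needs positive values
    logs = np.log(x)
    lo, hi = 0, len(logs)              # current window of non-outliers
    while hi - lo > 2:
        cur = logs[lo:hi]
        m, s = cur.mean(), cur.std()
        if s == 0:
            break
        if m - cur[0] > lam * s:       # minimal observation is an outlier
            lo += 1
        elif cur[-1] - m > lam * s:    # maximal observation is an outlier
            hi -= 1
        else:
            break
    kept = x[lo:hi]
    return float(kept.std() / kept.mean())
```

On data with a few extreme gradient magnitudes (bright reflections, text labels), this trimmed CV is much smaller than the plain CV, which is exactly the discrepancy exploited below.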
- the CV_rob-measure may be used to identify images of low gradient contrast.
- the original CV measure may however provide additional information in order to identify images with unusually large contrast. These are usually images that should not be processed automatically for detection of structures or lesions; either because the retina contains pathologies like laser scars, because the image contains artifacts such as text-labels printed on the retinal section, or because a non-fundus image (for instance an image of the eyeball) is erroneously passed to the algorithm. It has been found that a CV larger than about 1.3 indicates an unusually large contrast, and a CV_rob less than about 0.6 corresponds to images of low quality.
- Figures 1-4 display four images and their gradient contrast measures.
- the top left image is acquired from a patient who has cataract, which is seen as a blurring of the image.
- the algorithm will most likely fail on this image, and it should be excluded from automatic analysis; in fact, it should probably not be graded at all.
- the robust CV is 0.54 and the image will thus be excluded with the 0.6-threshold proposed above.
- the top right panel displays an image of excellent quality; the robust CV is 1.05.
- the bottom left panel illustrates a case where the robust and un-robust CV's differ significantly.
- the retina is poorly illuminated and the image should be discarded; the small bright reflections yield a large gradient variance, however, thus the un-robust CV is 0.71 and the image would be accepted based on this measure.
- the gradients along the circumference of the reflections are removed, and the CV is 0.58, whence the image is excluded from analysis.
- the bottom right image has unusually large CV's, which indicate that it is not a typical fundus image; in this case, the large scars on the retina are the reason. As these will probably confuse the algorithm, the image should be returned for manual grading.
- the robustness of the quality measure has been evaluated on the gradient image.
- the gradient image can be obtained by different means; presently both polynomial and Gaussian gradient estimation are implemented in the lesion module, and the gradients will to some extent depend on the parameter settings of the methods.
- the measure for a range of parameters for each of the four images displayed in Figures 1-4 has been calculated. The results are listed in Table 1 and displayed as a plot in Figure 5.
- the gradients are estimated using either a Gaussian kernel or polynomial kernels.
- Gaus_x refers to a Gaussian kernel with standard deviation x units, wherein one unit is approximately 10 μm;
- Poly_x_y refers to a polynomial filter of order x and kernel-size y pixels. Note that for the poly-kernels the kernel-size is not scaled with the image scale.
- the original images are of scale 0.9, 0.4, 1.0 and 0.4 respectively, and the 0.4 images are resampled to scale 0.6 prior to the gradient estimation.
- the CV's generally increase with the degree of smoothing (i.e. they increase with the width of the Gaussian kernel, and decrease with the poly-kernel-size), which may be explained by a general decrease in the mean gradient magnitude.
- the CV's are reasonably robust when widths in the normal range 1.3 - 2.0 are used.
- for the polynomial kernels there is a more pronounced variation, which has no systematic pattern between the images.
- the kernel-sizes are measured in pixels, and hence it is a bit difficult to compare the variation for images of different scales.
- the CV's are somewhat sensitive to the choice of kernel-size and order. This may be related to the fundamental difference between the Gaussian and polynomial kernels.
- the interlacing effect may preferably be calculated by measures derived from the Fourier transformation.
- the interlacing measure is based on the ratio of the power at n predetermined frequencies in the column direction and the row direction of the image, wherein n is an integer > 1.
- this effect may be calculated by the ratio of the powers at frequency ½ in the vertical and horizontal directions.
- Let H_{r,c} denote the discrete Fourier transformation of the image at frequencies (r/R, c/C) pixel-sides⁻¹, where h_{j,k} denotes the intensity in row j and column k, and R and C are the number of rows and columns respectively. Assuming that R and C are even (which is usually the case), the interlacing ratio is then given by IL = |H_{R/2,0}|² / |H_{0,C/2}|².
- the IL is larger than 1, since interlacing is typically seen as horizontal stripes in the image. Images of good quality have IL ratios up to about 10. The interlacing effects are clearly visible when the IL ratio is higher than 20. In a few images, every second line may be missing; these have an IL ratio in the order of 10000.
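Assuming the single predetermined frequency is the Nyquist frequency ½, the interlacing ratio can be sketched with a 2-D FFT (`interlacing_ratio` is a hypothetical name):

```python
import numpy as np

def interlacing_ratio(image):
    """IL ratio: power at frequency 1/2 in the vertical (row)
    direction over the power at frequency 1/2 in the horizontal
    (column) direction. Horizontal interlacing stripes concentrate
    energy at the vertical Nyquist frequency, giving a large ratio."""
    F = np.fft.fft2(image.astype(float))
    R, C = image.shape
    p_vert = abs(F[R // 2, 0]) ** 2    # frequency (1/2, 0)
    p_horz = abs(F[0, C // 2]) ** 2    # frequency (0, 1/2)
    return float(p_vert / max(p_horz, 1e-12))
```

An image with every second line blank (stripes of period two rows) puts essentially all Nyquist energy in the vertical direction and yields an enormous ratio.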
- a quality measure selected from an illumination measure is usually a local measure, which is used to identify regions that are too dark or too bright due to inhomogeneous illumination of the retina, but may of course be applied as a global measure as well, i.e. when the whole image is poorly illuminated.
- the aim is to use this both locally, for example for removing these areas prior to detecting lesions, and globally as a measure of the overall illumination quality.
- the illumination quality of the image may be calculated by measuring the saturation of the image using at least one of the channels, preferably at least two of the channels, such as the red and the green channel of the image.
- the quality is calculated using the saturation of the red and green channels; a low saturation value is a quality measure for improper illumination of the image.
- the blue channel is preferably omitted from the saturation-value, however, as this channel in general contains very little information. In the following, saturation will always refer to red/green-saturation in the above sense.
- the saturation image is mean-filtered with kernel size 51 units, wherein one unit is approximately 10 μm, and normalized by the mean saturation within the ROI. This normalization ensures that only areas that have low saturation values relative to the global image saturation are masked out. Improperly illuminated regions are sections of the image which touch the boundary of the ROI and have a normalized saturation value less than about 0.55. These regions may be identified by growing regions from the boundary pixels in the saturation image. The total area of these sections, relative to the size of the ROI, constitutes the global illumination quality of the image. For automatic screening the image should be excluded from analysis if a large part of the retina, more than 40% for instance, is badly illuminated.
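A much-simplified sketch of the global illumination check. Assumptions: the red/green saturation is approximated here by the mean of the red and green channels, and the 51-unit mean filtering and boundary region growing are omitted; `badly_illuminated_fraction` is a hypothetical name.

```python
import numpy as np

def badly_illuminated_fraction(rgb, roi, thresh=0.55):
    """Fraction of the ROI whose normalized red/green saturation is
    below `thresh`; per the text, a fraction above about 0.40 would
    exclude the image from automatic screening."""
    sat = (rgb[..., 0].astype(float) + rgb[..., 1].astype(float)) / 2.0
    norm = sat / sat[roi].mean()       # normalize by the mean ROI saturation
    dark = (norm < thresh) & roi
    return float(dark.sum() / roi.sum())
```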
- This is preferably a local measure, which is used to identify sections with a low signal-to-noise ratio.
- the aim is to use this both locally, for removing these areas prior to detecting lesions, and globally as a measure of the overall SNR quality.
- the "signal” (i.e. the signal and the noise) is defined as un-sharp masking of the original image, with a kernel-size of for example 191 pixels in the imgaes, wherein one unit is approximately 10 ⁇ m. Thus, low-frequency components are removed from the image.
- Let {S_i} denote the resulting image.
- the noise image {N_i} is defined by the residual image after subtracting the 3×3 median-filtered signal from {S_i}.
- the local signal-to-noise ratio is defined as the variation of the signal relative to that of the noise, SNR_i² = Σ_{j∈V_i}(S_j − S̄_i)² / Σ_{j∈V_i} N_j², where V_i is a region of size 51 × 51 around pixel i, and S̄_i is the average of S over V_i.
- the SNR-image is finally smoothed with a mean-filter of for example size 51 × 51 to obtain the final SNR-image.
- if the SNR is low, the noise level will locally be too high for automated analysis of the image.
- either one may use the average SNR or the fraction of the retinal part of the image that has an SNR level lower than 1.3. The latter will often be most appropriate for robustness reasons.
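The SNR pipeline (unsharp masking, 3×3 median residual as noise, local windows) can be sketched as follows. Window sizes are reduced here for brevity, and the ratio of local standard deviations stands in for the patent's exact formula; all function names are hypothetical.

```python
import numpy as np

def box_mean(x, k):
    """k x k box (mean) filter via 2-D cumulative sums, edge-padded."""
    pad = k // 2
    xp = np.pad(x.astype(float), pad, mode="edge")
    cs = np.pad(xp.cumsum(0).cumsum(1), ((1, 0), (1, 0)))
    R, C = x.shape
    return (cs[k:k + R, k:k + C] - cs[:R, k:k + C]
            - cs[k:k + R, :C] + cs[:R, :C]) / (k * k)

def median3(x):
    """3 x 3 median filter, edge-padded, numpy only."""
    xp = np.pad(x, 1, mode="edge")
    R, C = x.shape
    shifts = [xp[i:i + R, j:j + C] for i in range(3) for j in range(3)]
    return np.median(np.stack(shifts), axis=0)

def snr_map(image, big=31, win=21):
    """Local SNR sketch: signal = unsharp-masked image, noise = signal
    minus its 3x3 median; SNR is the local signal std over the local
    noise std in a win x win window."""
    signal = image.astype(float) - box_mean(image, big)
    noise = signal - median3(signal)
    s_var = np.maximum(box_mean(signal ** 2, win) - box_mean(signal, win) ** 2, 0.0)
    n_var = np.maximum(box_mean(noise ** 2, win) - box_mean(noise, win) ** 2, 1e-12)
    return np.sqrt(s_var / n_var)
```

Either the mean of this map or the fraction of retinal pixels below a threshold (1.3 in the text) could then serve as the global SNR quality.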
- the quality measure may include specific measures for identifying the artefacts, such as templates searching for drops of liquid or secretions that may be positioned on the camera lens; text lines inadvertently placed by the electronic system over the image, giving rise to high variations, which may also be detected by templates or by searching for extrema in single colour channel image functions; spots due to camera artefacts; pixel errors due to camera or CCD errors; and reflections that may arise from a healthy individual's vitreous body and lead to bright areas around the fovea and extensions therefrom.
- an artefact according to this invention is any presentation on the image that is not part of the scene of the image.
- artefacts may for example be one or more of the following: undesired objects projected onto the image, dispersion, diffraction and/or reflection of the optic system.
- Examples of the undesired objects projected onto the image are an eyelash, the edge of the iris, text in the image, or digital errors.
Pathologies
- Pathologies whose indicators may be quantified or qualified by the present invention are generally pathologies relating to the parts of the eye lying in front of the fundus itself. However, some pathologies may also be present in the fundus region to such an extent that they influence the overall quality.
- pathology influencing the quality in a systematic manner may be quantified or qualified by calculating a quality measure and correlating it with standards.
- the pathology indicators may be indicators for one or more of the following pathologies: glaucoma, diabetic retinopathy, amotio retina, hemorrhages of the retina, cataract, scars, photo-coagulation scars, laser scars, pathological vessel growth.
- the quality measures comprise calculating of at least a contrast measure and an interlacing measure.
- Calculation of a quality measure and estimation of a gradability measure may be used for identifying regions in the image that should be masked before detecting the presence or absence of a structure and/or a pathological condition in parts of a fundus image, said method comprising
- the method is preferably combined with methods for detecting the specific structures, such as vessels, the optic nerve head and the fovea, and detecting lesions, such as microaneurysms and exudates, which show up on fundus images as generally "dot shaped" (i.e. substantially circular) areas. It is of interest to distinguish between such microaneurysms and exudates, and further to distinguish them from other pathologies in the image, such as "cotton wool spots" and hemorrhages.
- the quality measure may be used in a method for registering at least two fundus images from the same eye, comprising
- the present invention relates to a method for selecting fundus images for automatic screening, comprising d) acquiring a fundus image,
- images with poor quality should be returned to the user with a message that the image could not be processed.
- Many images which have poor gradient contrast are also judged to be ungradeable by the human grader, and by returning the images immediately, the algorithms will treat data much as a human grader would. It is within the scope of the invention that the quality measure is conducted immediately after the recording of the image, to let the photographer acquire a new image if the first one was unacceptable.
- the invention also relates to a method for detecting the presence or absence of a structure and/or a pathological condition in parts of a fundus image taking into account the quality measure and gradability measure, said method comprising
- the invention further relates to a system comprising the algorithms capable of performing the methods according to the invention.
- the system according to the invention may be any system capable of conducting the method as described above as well as any combinations thereof within the scope of the invention.
- the system may include algorithms to perform any of the methods de- scribed above.
- a graphical user interface module may operate in conjunction with a display screen of a display monitor.
- the graphical user interface may be implemented as part of the processing system to receive input data and commands from a conventional keyboard and mouse through an interface and display results on a display monitor.
- many components of a conventional computer system, such as address buffers, memory buffers, and other standard control circuits, have not been discussed, because these elements are well known in the art and a detailed description thereof is not necessary for understanding the present invention.
- Pre-acquired image data can be fed directly into the processing system through a network interface and stored locally on a mass storage device and/or in a memory. Furthermore, image data may also be supplied over a network, through a portable mass storage medium such as a removable hard disk, optical disks, tape drives, or any other type of data transfer and/or storage devices which are known in the art.
- a parallel computer platform having multiple processors is also a suitable hardware platform for use with a system according to the present invention.
- Such a configuration may include, but is not limited to, parallel machines and workstations with multiple processors.
- the processing system can be a single computer, or several computers can be connected through a communications network to create a logical processing system.
- the present system allows the grader, that is, the person normally grading the images, to identify the structures and lesions more rapidly and reliably.
- the system described in the following is more reliable, and it also makes it possible to acquire the images at one location and examine them at another.
- the images may be recorded by any optician or physician, or elsewhere, and transported to the examining specialist, either as photographs or on digital media. Accordingly, by use of the present system, decentralised centres for recording the images can be maintained while fewer expert graders are required.
- the network may carry data signals including control or image adjustment signals by which the expert examining the images at the examining unit directly controls the image acquisition occurring at the recording location, i.e. the acquisition unit.
- control signals, such as commands for zoom magnification, steering adjustments, and wavelength of field illumination, may be selectively varied remotely to achieve the desired imaging effect.
- questionable tissue structures requiring greater magnification or a different perspective for their elucidation may be quickly resolved without ambiguity by varying such control parameters.
- by switching illumination wavelengths, views may be selectively taken to represent different layers of tissue, or to accentuate imaging of the vasculature and blood flow characteristics.
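Wavelength switching is an acquisition-side control, but a comparable effect can be approximated on an already-recorded colour image by channel selection: the green channel of an RGB fundus image is widely known to show the retinal vasculature at the highest contrast. The sketch below (an illustrative assumption, not a step from the patent) extracts and inverts that channel so vessels appear bright.

```python
import numpy as np

def vessel_contrast_channel(rgb):
    """Return the green channel of an RGB fundus image, inverted so
    that dark vessels appear bright against a dark background; the
    green channel typically carries the strongest vessel contrast."""
    green = np.asarray(rgb, dtype=float)[:, :, 1]
    return green.max() - green
```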
- control signals may include time varying signals to initiate stimulation with certain wavelengths of light, to initiate imaging at certain times after stimulation or delivery of dye or drugs, or other such precisely controlled imaging protocols.
- the digital data signals for these operations may be interfaced to the ophthalmic equipment in a relatively straightforward fashion, provided such equipment already has initiating switches or internal digital circuitry for controlling the particular parameters involved, or is capable of readily adapting electric controls to such control parameters as system focus, illumination and the like.
- the imaging and ophthalmic treatment instrumentation in this case will generally include a steering and stabilization system which maintains both instruments in alignment and stabilized on the structures appearing in the field of view.
- the invention contemplates that the system control further includes image identification and correlation software which allows the ophthalmologist at the examination site to identify particular positions in the retinal field of view, such as pinpointing particular vessels or tissue structures, while the image acquisition computer includes image recognition software which enables it to identify patterns in the video frames and correlate the identified position with each image frame as it is acquired at the acquisition site.
- the image recognition software may lock onto a pattern of retinal vessels.
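The patent does not specify how the lock-on works; a minimal stand-in is exhaustive normalised cross-correlation over a small window of integer translations, which "locks onto" the vessel pattern shared by two frames. This sketch (names and search strategy are assumptions) handles translation only, not rotation or scale.

```python
import numpy as np

def register_translation(fixed, moving, max_shift=10):
    """Estimate the integer (dy, dx) translation that best aligns
    `moving` to `fixed` by exhaustive normalised cross-correlation
    over a [-max_shift, max_shift] search window."""
    fixed = np.asarray(fixed, dtype=float)
    moving = np.asarray(moving, dtype=float)
    best, best_score = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(moving, dy, axis=0), dx, axis=1)
            a = fixed - fixed.mean()
            b = shifted - shifted.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            score = (a * b).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best
```

Given a frame and a translated copy of it, the function recovers the inverse of the applied shift.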
- the invention further contemplates that the images provided by acquisition unit are processed for photogrammetric analysis of tissue features and optionally blood flow characteristics. This may be accomplished as follows. An image acquired at the recordation unit is sent to an examination unit, where it is displayed on the screen. As indicated schematically in the figure, such image may include a network of blood vessels having various diameters and lengths. These vessels include both arterial and venous capillaries constituting the blood supply and return network.
- the workstation may be equipped with a photogrammetric measurement program which may, for example, enable the technician to place a cursor on an imaged vessel and, by moving the cursor along the vessel while clicking, have the software automatically determine the width of the vessel and the subvessels to which it is connected, as well as their coordinates.
- the software for noting coordinates from the pixel positions and linking displayed features in a record, as well as submodules which determine vessel capacities and the like, are straightforward and readily built up from photogrammetric program techniques.
- Work station protocols may also be implemented to automatically map the vasculature as described above, or to compare two images taken at historically different times and identify or annotate the changes which have occurred, highlighting for the operator features such as vessel erosion, tissue which has changed colour, or other differences.
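A crude starting point for the change identification described above, assuming the two historical images have already been registered to each other, is a thresholded absolute-difference map marking pixels that changed appreciably; the threshold and names here are illustrative assumptions.

```python
import numpy as np

def change_map(earlier, later, threshold):
    """Highlight pixels whose intensity changed by more than
    `threshold` between two registered images of the same retina,
    as raw material for annotating vessel erosion or colour change."""
    diff = np.abs(np.asarray(later, dtype=float) -
                  np.asarray(earlier, dtype=float))
    return diff > threshold
```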
- a graphical user interface allows the specialist to type in diagnostic indications linked to the image, or to a particular feature appearing at a location in the image, so that the image or a processed version of it becomes more useful.
- the relative health of the vessel, its blood carrying capacity and the like may also be visually observed and noted.
- This photogrammetric analysis allows a road map of the vasculature and its capacity to be compiled, together with annotations as to the extent of tissue health or disease apparent upon such inspection.
- a very precise and well-annotated medical record may be readily compiled and may be compared to a previously taken view for detailed evidence of changes over a period of time, or may be compared, for example, to immediately preceding angiographic views in order to assess the actual degree of blood flow occurring therein.
- the measurement entries at examination unit become an annotated image record and are stored in the central library as part of the patient's record.
- the present invention changes the dynamics of patient access to care, and the efficiency of delivery of ophthalmic expertise in a manner that solves an enormous current health care dilemma, namely, the obstacle to proper universal screening for diabetic retinopathy.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DKPA200101450 | 2001-10-03 | ||
US37501602P | 2002-04-25 | 2002-04-25 | |
US60/375,016 | 2002-04-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2003030073A1 (fr) | 2003-04-10 |
Family
ID=26069072
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/DK2002/000660 WO2003030073A1 (fr) | 2001-10-03 | 2002-10-03 | Mesure de qualite |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2003030073A1 (fr) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2470727A (en) * | 2009-06-02 | 2010-12-08 | Univ Aberdeen | Processing retinal images using mask data from reference images |
GB2491941A (en) * | 2011-10-24 | 2012-12-19 | Iriss Medical Technologies Ltd | Image processing to detect abnormal eye conditions |
WO2013078582A1 (fr) * | 2011-11-28 | 2013-06-06 | Thomson Licensing | Mesure de qualité vidéo considérant de multiples artéfacts |
US8879813B1 (en) | 2013-10-22 | 2014-11-04 | Eyenuk, Inc. | Systems and methods for automated interest region detection in retinal images |
US20150124218A1 (en) * | 2011-09-20 | 2015-05-07 | Canon Kabushiki Kaisha | Image processing apparatus, ophthalmologic imaging apparatus, image processing method, and storage medium |
WO2016032397A1 (fr) * | 2014-08-25 | 2016-03-03 | Agency For Science, Technology And Research (A*Star) | Procédés et systèmes permettant d'évaluer des images rétiniennes et d'obtenir des informations à partir d'images rétiniennes |
WO2016040317A1 (fr) * | 2014-09-08 | 2016-03-17 | The Cleveland Clinic Foundation | Analyse automatisée d'images angiographiques |
EP2856930A4 (fr) * | 2012-05-04 | 2016-06-15 | Uni Politècnica De Catalunya | Procédé permettant de détecter des pertes de la fonction visuelle |
US9384416B1 (en) | 2014-02-20 | 2016-07-05 | University Of South Florida | Quantitative image analysis applied to the grading of vitreous haze |
US9905008B2 (en) | 2013-10-10 | 2018-02-27 | University Of Rochester | Automated fundus image field detection and quality assessment |
US10278859B2 (en) | 2014-10-17 | 2019-05-07 | The Cleveland Clinic Foundation | Image-guided delivery of ophthalmic therapeutics |
- 2002-10-03: PCT/DK2002/000660 filed; published as WO2003030073A1 (fr); not active (Application Discontinuation)
Non-Patent Citations (2)
Title |
---|
CIDECIYAN A V: "REGISTRATION OF OCULAR FUNDUS IMAGES", IEEE ENGINEERING IN MEDICINE AND BIOLOGY MAGAZINE, IEEE INC. NEW YORK, US, vol. 14, no. 1, 1995, pages 52 - 58, XP000486770, ISSN: 0739-5175 * |
LEE S C ET AL: "Comparison of diagnosis of early retinal lesions of diabetic retinopathy between a computer system and human experts.", ARCHIVES OF OPHTHALMOLOGY. UNITED STATES APR 2001, vol. 119, no. 4, April 2001 (2001-04-01), pages 509 - 515, XP008008870, ISSN: 0003-9950 * |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2470727A (en) * | 2009-06-02 | 2010-12-08 | Univ Aberdeen | Processing retinal images using mask data from reference images |
US11058293B2 (en) * | 2011-09-20 | 2021-07-13 | Canon Kabushiki Kaisha | Image processing apparatus, ophthalmologic imaging apparatus, image processing method, and storage medium |
US20150124218A1 (en) * | 2011-09-20 | 2015-05-07 | Canon Kabushiki Kaisha | Image processing apparatus, ophthalmologic imaging apparatus, image processing method, and storage medium |
GB2491941A (en) * | 2011-10-24 | 2012-12-19 | Iriss Medical Technologies Ltd | Image processing to detect abnormal eye conditions |
GB2491941B (en) * | 2011-10-24 | 2013-09-25 | Iriss Medical Technologies Ltd | System and method for identifying eye conditions |
US9149179B2 (en) | 2011-10-24 | 2015-10-06 | Iriss Medical Technologies Limited | System and method for identifying eye conditions |
WO2013078582A1 (fr) * | 2011-11-28 | 2013-06-06 | Thomson Licensing | Mesure de qualité vidéo considérant de multiples artéfacts |
US9924167B2 (en) | 2011-11-28 | 2018-03-20 | Thomson Licensing | Video quality measurement considering multiple artifacts |
EP2856930A4 (fr) * | 2012-05-04 | 2016-06-15 | Uni Politècnica De Catalunya | Procédé permettant de détecter des pertes de la fonction visuelle |
US9905008B2 (en) | 2013-10-10 | 2018-02-27 | University Of Rochester | Automated fundus image field detection and quality assessment |
US8885901B1 (en) | 2013-10-22 | 2014-11-11 | Eyenuk, Inc. | Systems and methods for automated enhancement of retinal images |
US9008391B1 (en) | 2013-10-22 | 2015-04-14 | Eyenuk, Inc. | Systems and methods for processing retinal images for screening of diseases or abnormalities |
US9002085B1 (en) | 2013-10-22 | 2015-04-07 | Eyenuk, Inc. | Systems and methods for automatically generating descriptions of retinal images |
US8879813B1 (en) | 2013-10-22 | 2014-11-04 | Eyenuk, Inc. | Systems and methods for automated interest region detection in retinal images |
US9384416B1 (en) | 2014-02-20 | 2016-07-05 | University Of South Florida | Quantitative image analysis applied to the grading of vitreous haze |
WO2016032397A1 (fr) * | 2014-08-25 | 2016-03-03 | Agency For Science, Technology And Research (A*Star) | Procédés et systèmes permettant d'évaluer des images rétiniennes et d'obtenir des informations à partir d'images rétiniennes |
EP3186779A4 (fr) * | 2014-08-25 | 2018-04-04 | Agency For Science, Technology And Research (A*star) | Procédés et systèmes permettant d'évaluer des images rétiniennes et d'obtenir des informations à partir d'images rétiniennes |
US10325176B2 (en) | 2014-08-25 | 2019-06-18 | Agency For Science, Technology And Research | Methods and systems for assessing retinal images, and obtaining information from retinal images |
WO2016040317A1 (fr) * | 2014-09-08 | 2016-03-17 | The Cleveland Clinic Foundation | Analyse automatisée d'images angiographiques |
US10628940B2 (en) | 2014-09-08 | 2020-04-21 | The Cleveland Clinic Foundation | Automated analysis of angiographic images |
US10278859B2 (en) | 2014-10-17 | 2019-05-07 | The Cleveland Clinic Foundation | Image-guided delivery of ophthalmic therapeutics |
US10888455B2 (en) | 2014-10-17 | 2021-01-12 | The Cleveland Clinic Foundation | Image-guided delivery of ophthalmic therapeutics |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7583827B2 (en) | Assessment of lesions in an image | |
Tobin et al. | Detection of anatomic structures in human retinal imagery | |
Hubbard et al. | Methods for evaluation of retinal microvascular abnormalities associated with hypertension/sclerosis in the Atherosclerosis Risk in Communities Study | |
Quellec et al. | Optimal filter framework for automated, instantaneous detection of lesions in retinal images | |
Chun et al. | Objective assessment of corneal staining using digital image analysis | |
US20120065518A1 (en) | Systems and methods for multilayer imaging and retinal injury analysis | |
Bartlett et al. | Use of fundus imaging in quantification of age-related macular change | |
Köse et al. | A statistical segmentation method for measuring age-related macular degeneration in retinal fundus images | |
US20220319708A1 (en) | Automated disease identification based on ophthalmic images | |
EP1716804A1 (fr) | Procede et instrument de mesure optique du fonctionnement de la retine | |
WO2003030073A1 (fr) | Mesure de qualite | |
CN113768461A (zh) | 一种眼底图像分析方法、系统和电子设备 | |
Zangwill et al. | Optic nerve imaging: recent advances | |
WO2003030075A1 (fr) | Detection de papille optique dans une image de fond d'oeil | |
Azar et al. | Classification and detection of diabetic retinopathy | |
Peli | Electro-optic fundus imaging | |
WO2004082453A2 (fr) | Determination de lesions dans une image | |
WO2003030101A2 (fr) | Detection de vaisseaux sur une image | |
Noronha et al. | Automated diagnosis of diabetes maculopathy: a survey | |
DK1444635T3 (en) | Assessment of lesions in an image | |
Mohammadi et al. | The computer based method to diabetic retinopathy assessment in retinal images: a review. | |
Kaur et al. | Preliminary analysis and survey of retinal disease diagnosis through identification and segmentation of bright and dark lesions | |
Balasubramanian et al. | Algorithms for detecting glaucomatous structural changes in the optic nerve head | |
Dhayanithi et al. | Enhanced Detection of Eye Coloboma through Global Attention U-Net (GA-UNet) for Accurate Ophthalmic Image Segmentation | |
Rehkopf et al. | Ophthalmic image processing |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GM HR HU ID IL IN IS JP KE KG KP KZ LC LK LR LS LT LU LV MA MD MK MN MW MX MZ NO NZ OM PH PT RO RU SD SE SG SI SK SL TJ TM TN TR TZ UA UG US UZ VC VN YU ZA ZM |
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ UG ZM ZW AM AZ BY KG KZ RU TJ TM AT BE BG CH CY CZ DK EE ES FI FR GB GR IE IT LU MC PT SE SK TR BF BJ CF CG CI GA GN GQ GW ML MR NE SN TD TG |
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
122 | Ep: pct application non-entry in european phase | ||
NENP | Non-entry into the national phase |
Ref country code: JP |
WWW | Wipo information: withdrawn in national office |
Country of ref document: JP |