
WO2003030075A1 - Detection of optic nerve head in a fundus image - Google Patents


Publication number
WO2003030075A1
Authority
WO
WIPO (PCT)
Prior art keywords
optic nerve
image
nerve head
candidate
head area
Application number
PCT/DK2002/000663
Other languages
French (fr)
Inventor
Per Rønsholt ANDRESEN
Johan Doré HANSEN
Michael Grunkin
Niels Vaever Hartvig
Jannik Godt
Ebbe Sørensen
Soffia Björk Smith
Original Assignee
Retinalyze Danmark A/S
Application filed by Retinalyze Danmark A/S filed Critical Retinalyze Danmark A/S
Publication of WO2003030075A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection

Definitions

  • the present invention relates to a method for assessing the presence or absence of the optic nerve head in images of the ocular fundus (the back of the eye), hereinafter referred to as the fundus.
  • Diabetes is the leading cause of blindness in working age adults. It is a disease that, among its many symptoms, includes a progressive impairment of the peripheral vascular system. These changes in the vasculature of the retina cause progressive vision impairment and eventually complete loss of sight. The tragedy of diabetic retinopathy is that in the vast majority of cases, blindness is preventable by early diagnosis and treatment, but screening programs that could provide early detection are not widespread.
  • the present invention relates to a method for assessing the presence or absence of the optic nerve head in fundus images.
  • Current methods may be able to detect the optic nerve head in many normal images, but the methods are not reliable when applied to images not containing the optic nerve.
  • the method should be robust in the sense that it should be applicable to a wide variety of images independent of illumination, presence of symptoms of diseases and/or artefacts of the image.
  • the present invention provides a method, comprising
  • n candidate optic nerve head area(s) wherein n is an integer ≥ 1,
  • step d) classifying the candidate optic nerve head area selected in step c) with respect to a threshold as the optic nerve head area or not.
  • the present invention provides a method, comprising
  • n is an integer ≥ 1,
  • step e) selecting the highest ranking candidate optic nerve head area fulfilling the validating criteria, f) classifying the candidate optic nerve head area selected in step e) with respect to a threshold as the optic nerve head area or not.
  • n is a small number, such as less than 15, more preferably less than 10, more preferably less than 5. Normally n is 2, 3, or 4, most preferably 4.
  • the invention relates to a system capable of conducting the method, such as a system for assessing the presence or absence of the optic nerve head in a fundus image, comprising
  • the method according to the invention may be applied in several procedures for identifying structures or indications of diseases or abnormal conditions.
  • the invention relates to the use of the method of assessing the optic nerve head for establishing a coordinate system on an image of the ocular fundus, comprising
  • assessing the presence of the fovea region, and arranging a coordinate system having one axis parallel with an axis through the optic nerve head and the fovea.
  • Such a coordinate system allows a precise location of other structures in the fundus image, thereby providing more exact diagnosis, for example by establishing a coordinate system by the method, and grading the lesions with respect to distance to the fovea region.
  • the localisation of for example lesions with respect to the fovea region may also be accomplished by another method according to the invention for establishing a coordinate system on an image of the ocular fundus, comprising
  • the optic nerve head may be used for registering the images, and accordingly, the present invention relates to a method for registering at least two different fundus images of the same fundus, comprising assessing the presence or absence of the optic nerve head in said images by a method as defined above, and orienting the images with respect to the optic nerve head.
  • the detection of optic nerve head may be used for detecting vessels in the image, optionally as an iterative method.
  • the present invention relates to a method for detecting vessels in a fundus image, comprising a) estimating the localisation of the optic nerve head region by a method as defined above,
  • steps a) and b) optionally repeating steps a) and b) at least once.
  • An important aspect of the present invention is the application of the method in a method for assessing the presence or absence of lesions in a fundus image.
  • the detection of the optic nerve head is used, for example, to mask the optic nerve head area in order to avoid false positive lesions very likely detected in the neighbourhood of the optic nerve head.
  • the invention also relates to a method for assessing the presence or absence of lesions in a fundus image, comprising
  • the invention relates to a method for detecting indicators of glaucoma in a fundus image, comprising
  • Fig. 1 is a fundus image.
  • Fig. 2 is an unsharp filtered image of the fundus of Fig. 1.
  • Fig. 3 is an image of scoring of branching points in the vessel tree of the fundus of Fig. 1. Starting points are established at highest scoring branching points.
  • Fig. 4 is the fundus image of Fig. 1 wherein a filter enhancing sagital structures is applied; the vessels have all been thinned to 1 pixel in width (Medial Axis Transformation, MAT). Starting points are established in maxima of the image.
  • Fig. 5 is the fundus image of Fig. 1 wherein a filter enhancing sagital structures is applied. Starting points are established in maxima of the image.
  • Fig. 6 is an image of the fundus in Fig. 1, wherein an intensity maximum has given rise to a starting point.
  • Fig. 7 is an image of the fundus in Fig. 1, wherein variance maximum of the image has given rise to a starting point.
  • Fig. 8 is an image of the fundus in Fig. 1, wherein the vessels have been masked.
  • Fig. 9 is an image of the fundus in Fig. 1, wherein the position of all starting points detected by the various methods is shown.
  • Fig. 10 is an image showing excluding areas of the fundus of Fig. 1 based on tangential vessel exclusion.
  • Fig. 11 shows the optimal circles found from the starting points, and the power assigned to the circles.
  • Fig. 12 shows the circle being accepted as the optic nerve head.
  • Fig. 13 shows 4 different fundus images, wherein the optic nerve head is present in two of the images, and absent in the remaining images.
  • Figure 14 The gradient orthogonal to a circle is used as cost function to find the border of the ONH.
  • the cost function is computed as the sum of individual gradients, where the gradient is computed as the difference between the intensity value at the end- and start-point.
  • the cost is computed as the difference between the sum of the intensity at outer- and inner-circle-points, (a) and (b) are equal (as long as the computations are linear).
  • Figure 15 The figure shows the correspondence between the circle power (sums of the intensities at the 'nCircleResolution' circle-points) and the radial distance from the centre of the ONH. The optimal radius is found at the highest gradient.
  • Figure 16 The circle power seen visually in Figure 11 here plotted as a graph.
  • FIG. 18 The schematic drawing shows the tangential vessel exclusion algorithm. An exclusion area is drawn around all significant vessels in the fundus image. The ONH is more likely to be found outside the tangential exclusion areas.
  • Figure 19 The figure depicts how the neighborhood around an initial vessel segment is expanded inside the confidence band.
  • the endpoints of the straight vessel segment are the expanded node points which are farthest away.
  • FIG. 20 The figure shows an ONH candidate and the crossing of an arcade (vessel segment).
  • Image The term image is used to describe a representation of the region to be examined, i.e. the term image includes 1-dimensional representations, 2-dimensional representations and 3-dimensional representations, as well as n-dimensional representations. Thus, the term image includes a volume of the region, a matrix of the region as well as an array of information of the region.
  • the term representative means that the starting point may represent a point or an area of the optic nerve head.
  • ROI: Region of interest. Visibility: The term visibility is used in the normal meaning of the word, i.e. how visible a lesion or a structure of the fundus region is compared to the background and other structures/lesions.
  • Optic nerve head The term is used in its normal anatomical meaning, i.e. the area in the fundus of the eye where the optic nerve enters the retina. Synonyms for the area are, for example, the "blind" spot, the papilla, or the optic disk.
  • Fovea The term is used in its normal anatomical meaning, i.e. the spot in the retina having a great concentration of cones, giving rise to sharp vision. Fovea and the term macula lutea are used as synonyms. The fovea also has increased pigmentation.
  • Red-green-blue image The term relates to the image having the red channel, the green channel and the blue channel, also called the RGB image.
  • Starting point The term describes a point or area for starting the search for candidate optic nerve head areas.
  • the term starting point is thus not limited to a mathematical point, such as not limited to a pixel, but merely denotes a localisation for starting a search.
  • the images of the present invention may be any sort of images and presentations of the fundus.
  • the image is presented on a medium selected from slides, paper photos or digital photos.
  • the image may be any other kind of representation, such as a presentation on an array of elements, for example a CCD.
  • the image may be a grey-toned image or a colour image; in a preferred embodiment the image is a colour image.
  • the green and/or the red channel is used for assessing the presence or absence of the optic nerve head area, and more preferably an average of the green and the red channel is used.
  • the candidate optic nerve head area(s) may be detected by any suitable method, for example by filtering, by template matching, by establishing starting points, and from said starting points grow regions and/or by other methods search for candidate areas, and/or combinations thereof.
  • the candidate optic nerve head area(s) are detected by establishing starting points, and from the starting points searching for candidate areas.
  • the starting points may be established by a variety of suitable methods and of combinations of such methods.
  • the establishment of starting points is conducted by applying the same approach that, in the present inventors' experience, the human eye uses for identifying the optic nerve head when examining the fundus image manually. This approach is principally a combination of following the vessels in the image and looking for a bright area.
  • the image may be filtered and/or blurred before establishing or as a part of establishing starting points for the method.
  • the low frequencies of the image may be removed before establishing starting points.
  • the image may be unsharp filtered, for example by median or mean filtering the image and subtracting the filtered result from the image.
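As a concrete illustration of the unsharp filtering mentioned above, a minimal sketch is given below. It assumes a row-major float image and a mean filter; the kernel size is arbitrary, and a median filter could be substituted in the same structure.

```cpp
#include <cstddef>
#include <vector>

// Unsharp filtering as described above: blur the image with a mean filter and
// subtract the blurred result from the original, which removes the low image
// frequencies. The image is row-major with size rows x cols.
std::vector<float> unsharpFilter(const std::vector<float>& img,
                                 int rows, int cols, int halfKernel)
{
    std::vector<float> out(img.size());
    for (int r = 0; r < rows; ++r) {
        for (int c = 0; c < cols; ++c) {
            double sum = 0.0;
            int n = 0;
            for (int dr = -halfKernel; dr <= halfKernel; ++dr) {
                for (int dc = -halfKernel; dc <= halfKernel; ++dc) {
                    int rr = r + dr, cc = c + dc;
                    if (rr < 0 || rr >= rows || cc < 0 || cc >= cols)
                        continue;               // ignore samples outside the image
                    sum += img[static_cast<std::size_t>(rr) * cols + cc];
                    ++n;
                }
            }
            double mean = sum / n;              // local mean (the low-pass component)
            out[static_cast<std::size_t>(r) * cols + c] =
                img[static_cast<std::size_t>(r) * cols + c] - static_cast<float>(mean);
        }
    }
    return out;
}
```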
  • the starting points may be established as extrema of the image.
  • the image is however a filtered image, wherein the filtering may be linear and/or non-linear.
  • the filtering method is a filtering method using templates, wherein the template may exhibit any suitable geometry for identifying the optic nerve head.
  • templates are circular templates having a diameter of the expected optic nerve head ± some percent, for example ± 30 %. It is within the scope of the invention that the image may be filtered with one or more filters before establishing starting points, or as a part of the step of establishing starting points. Thus, in one embodiment of the invention starting points are established by combining two or more filters.
  • the extrema may thus be identified individually by one or more of several methods, such as the following:
  • Searching for the part of the vessels where no further branching points are detected, that is, establishing at least one extremum in the image based on vessel branching points, preferably establishing at least one maximum in the image based on vessel branching points.
  • the blood vessels branching perpendicularly at the optic nerve head area may be designated longitudinal blood vessels or sagitally oriented vessels, and the optic nerve head may be detected by searching for the longitudinal vessels.
  • one method may be establishing at least one extremum in the image based on a filter enhancing sagital structures, preferably establishing at least one maximum in the image based on a filter enhancing sagital structures.
  • the sagital filtering may be conducted on the vessels present in the image without any transformation. However, the sagital filtering may additionally or only be conducted on a thinned vessel image, for example wherein the vessels are all thinned to one pixel.
  • the vascular system may be isolated from the rest of the image context and skeletonized, i.e. the network of the vessels identified.
  • One method for tracking vessels may be to extract linear components that may be regarded as those of the blood vessels. That is, groups of pixels forming bright lines on the dark background are considered to be images of the blood vessels.
  • Another method for tracking vessels is a method wherein use is made of the fact that the vessels are linear in a local neighbourhood, wherein different filter matrices have different orientations. The localisation and orientation of such line elements may be determined using a template matching approach (sometimes referred to as matched filters).
  • a preferred method for tracking vessels is by tracking individual vessels from starting points representative for vessels, and iteratively growing the vessel network of the retina.
  • a preferred embodiment hereof is described in a co-pending PCT patent application entitled "Vessel tracking" of RETINALYZE A/S.
  • the estimation of candidate optic nerve head areas is adjusted with respect to vessels appearing in the image.
  • By adjusted is meant either that an iterative estimation of optic nerve head and vessels is conducted, wherein for each iteration the significance of the localisation of both increases towards a maximum, or that knowledge of the anatomical localisation of vessels adjacent to the optic nerve head is used for locating and/or validating the position of the optic nerve head.
  • the estimation of candidate optic nerve head areas is preceded by detection of vessels in the image. Having identified the blood vessels in the image, it is desirable to be able to distinguish between veins and arteries among the blood vessels. This can be important, for example in the diagnosis of venous beading and focal arteriolar narrowing.
  • the vascular system observed in the ocular fundus images is by nature a 2-dimensional projection of a 3-dimensional structure. It is quite difficult in principle to distinguish veins from arteries solely by looking at isolated vessel segments. However, it has been discovered that effective separation can be achieved by making use of the fact that, individually, the artery structure and the vein structure are each a perfect tree (i.e., there is one unique path along the vessels from the heart to each capillary and back).
  • the artery and vein structures are each surface filling, so that all tissue is either supplied or drained by specific arteries or veins, respectively.
  • a method for distinguishing veins from arteries is described in WO 00/65982 to Torsana Diabetes Diagnostic A/S and is based on the realisation that crossings of vessel segments are, for practical purposes, always between a vein and an artery (i.e. crossings between arteries and arteries or between veins and veins are, for practical purposes, non-existent).
  • the optic nerve head will normally be one of the brightest areas in the image, or at least locally the brightest area.
  • a method may be establishing at least one intensity extremum in the image, preferably at least one intensity maximum.
  • Causes of failure include, for example, the quite normal shadow in the region of the optic nerve head, where the shadow is caused by the nose of the person having his or her fundus examined. Therefore, in a preferred embodiment at least one local intensity maximum is established.
  • the extrema may be established on any image function, such as wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel or any combinations thereof.
  • the method may include establishing at least one variance extremum in the image, preferably establishing at least one variance maximum in the image. For the same reasons as described with respect to the intensity, at least one local variance maximum is established.
  • the extrema may be established on any image function, such as wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel or any combinations thereof.
  • the variance extremum is a weighted variance maximum, or a local variance maximum, more preferably a local weighted variance maximum.
  • Another method for establishing starting points may be random establishment of starting points, wherein the ultimate random establishment is establishing a starting point in substantially each pixel of the image.
  • a random establishment may be combined with any of the methods discussed above.
  • the starting points may be established as grid points, such as evenly distributed or unevenly distributed grid points. Again this method may be combined with any of the methods of establishing extrema in the image and/or random establishment.
  • starting points are established by more than one of the methods described in order to increase the probability of assessing the correct localisation of the optic nerve head when present in the image, also with respect to images having less optimal illumination or presenting other forms of less optimal image quality.
  • starting points are established by at least two of the steps or methods described above, such as by at least three of the steps or methods described above, such as by at least four of the steps or methods described above, such as at least five of the steps or methods described above.

Search for best candidate
  • the search may be for a geometric shape representative for the optic nerve head.
  • the best candidate optic nerve head area may be represented by the periphery of the area, such as a closed curve, such as a circle, an ellipse, a snake, or a polygon, wherein the latter represents an irregular geometrical form.
  • the best candidate optic nerve head area is represented as an open curve, such as a circle, an ellipse, a snake, or a polygon.
  • the best candidate optic nerve head area may be represented by the area as such, for example represented as a closed area, such as a circle, an ellipse, a snake, or a polygon.
  • the search initiated in step b) is a search for a centre of a best matching circle, wherein the centre is positioned in a search region of a predetermined size established around each starting point.
  • the centre of the best matching circle is not necessarily the starting point; rather, the centre is positioned within the search region around the starting point.
  • the search for best matching circles is conducted by searching over a variety of radii in the search region.
  • the radius of the best matching circle is in the range of ± 100 % of the expected diameter of the optic nerve head, such as in the range of ± 90 %, such as in the range of ± 80 %, such as in the range of ± 70 %, such as in the range of ± 60 %, such as in the range of ± 50 %, such as in the range of ± 40 %, such as in the range of ± 35 %, such as in the range of ± 30 %, such as in the range of ± 25 %, such as in the range of ± 20 %, such as in the range of ± 15 % of the expected diameter of the optic nerve head.
  • the expected optic nerve head diameter may be predetermined as an absolute figure; however, in order to apply the method to a variety of images taken with different resolutions etc. it is more appropriate to estimate the expected optic nerve head diameter in relation to other structures in the image.
  • the expected optic nerve head diameter is estimated from the caliber of vessels in the image or simply from the size of the image.
  • the expected nerve head diameter may be estimated from the camera magnification, or from the height of the image.
  • the optic nerve head diameter may be established as a standard for the specific camera setup, by measuring the optic nerve head diameter in a number of different images.
  • the search region used for finding the best candidates may have any geometric form, such as a circular region, or a rectangular region.
  • the power is preferably a value representative for the visibility of the candidate in the image.
  • the power is calculated as a measure of the candidate optic nerve head area edge, such as wherein said measure is selected from the summarized gradient, the summarized variance and/or the mean of the summarized variance, Laplace filtering, the curvature, the intensity, the skewness, the kurtosis, a derived measure from Fourier transformation, a derived measure from the co-occurrence matrix, and a derived measure from the fractal dimension.
  • the power is calculated as the gradient of the candidate optic nerve head area edge.
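A minimal sketch of such a gradient-based power, following the circle construction of Figure 14: intensities sampled on the candidate circle are compared with intensities on a slightly enlarged outer circle and the differences are summed. The sampling resolution and the enlargement factor are illustrative values, not the patent's parameters.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Circle "power": summed radial gradient along the candidate periphery.
// img is a row-major float image of size rows x cols.
double circlePower(const std::vector<float>& img, int rows, int cols,
                   double cx, double cy, double radius,
                   int nCirclePoints = 64, double enlarge = 1.2)
{
    const double kPi = 3.14159265358979323846;
    double power = 0.0;
    for (int i = 0; i < nCirclePoints; ++i) {
        double a = 2.0 * kPi * i / nCirclePoints;
        int xi = static_cast<int>(cx + radius * std::cos(a));            // inner circle point
        int yi = static_cast<int>(cy + radius * std::sin(a));
        int xo = static_cast<int>(cx + enlarge * radius * std::cos(a));  // outer circle point
        int yo = static_cast<int>(cy + enlarge * radius * std::sin(a));
        if (xi < 0 || xi >= cols || yi < 0 || yi >= rows ||
            xo < 0 || xo >= cols || yo < 0 || yo >= rows)
            continue;  // points outside the image contribute nothing
        // The ONH is bright on a darker surround, so inner minus outer is
        // positive along its border.
        power += img[static_cast<std::size_t>(yi) * cols + xi] -
                 img[static_cast<std::size_t>(yo) * cols + xo];
    }
    return power;
}
```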
  • the power calculated is weighted with respect to other known structures present in the image in order to reduce the risk of inadvertent assignment of a too high or too low power to the candidate optic nerve head area.
  • known structures in the image may for example be the vessels present in the image, in particular the vessels in the local region comprising the candidate optic nerve head area.
  • another known structure is the border of the image, since the border between the area inside the image (the region of interest (ROI)) and the area outside the image may represent a high gradient.
  • the selection step is a step for selecting the most probable optic nerve head area(s) among the various candidates for further validation. Therefore the n best candidate optic nerve head area(s) are selected with respect to the power assigned as described above, i.e. the n candidate optic nerve head areas are the areas having the n highest powers.
  • n is an integer ≥ 1, such as an integer in the range of from 1-100, such as an integer in the range of from 1-50, such as an integer in the range of from 1-25, such as an integer in the range of from 1-10, such as an integer in the range of from 2-25, such as an integer in the range of from 2-10, such as an integer in the range of from 3-25, such as an integer in the range of from 3-10, such as n being 1, 2, 3, 4, 5 or 6.
  • the selected candidate optic nerve head areas are then ranked with respect to at least one validating criterion; said validating criteria may be related to the anatomical structures of the fundus region, i.e. related to the vessels as well as to the brightness of the optic nerve head area.
  • the validation step is conducted in order to increase the probability of the candidate optic nerve head area being the "true" optic nerve head. Although a power has been assigned previously, the power may be biased by local factors in the image, and may therefore not rank the candidates properly.
  • the candidate optic nerve head area is preferably ranked with respect to the presence or absence of at least one of the following validating criteria:
  • any substantial sagital vessels detected extending out superiorly and/or inferiorly from the candidate optic nerve head area; establishing at least one intensity extremum in the image, preferably at least one intensity maximum, more preferably at least one local intensity maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof, and/or
  • the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof,
  • a candidate cannot be ranked as a candidate if a minimum of the criteria is not fulfilled. Therefore, it may be preferred that the candidate optic nerve head area is ranked so that at least one of the criteria related to the vessels is fulfilled, and at least one of the other criteria is fulfilled, for example as ranking with respect to the presence or absence of the following validating criteria:
  • the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof, and/or
  • the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof.
  • the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof, and
  • the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof.
  • a further criterion may be added to the criteria discussed above or substituted for one of the criteria discussed above.
  • An example of such a further criterion may be
  • a candidate optic nerve head area not fulfilling at least one of the criteria is rejected as a candidate optic nerve head area, whereby it is not ranked but simply rejected.
  • the candidate optic nerve head area should preferably fulfil at least two of the criteria, otherwise the candidate is rejected as a candidate optic nerve head area.
  • the criteria mentioned above may be applied to the candidate optic nerve head area by weighting the power of the candidate optic nerve head areas with the criteria fulfilled by each candidate. Thereby a candidate area having an extraordinarily high power may be ranked highest, although some of the other criteria are not fulfilled.
  • the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof, or
  • the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof,
  • the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof, and
  • the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof.
  • Through the ranking it may be possible to obtain a candidate having the highest ranking although its power is not the highest.
  • the candidate optic nerve head areas being ranked and not rejected are subjected to a selection step, wherein the highest ranking candidate optic nerve head area fulfilling the validating criteria is selected for evaluating the probability that the candidate in fact represents the "true" optic nerve head area or not.
  • the threshold may be a predetermined absolute threshold, but in a preferred embodiment the threshold is a dynamic threshold relating to the image and the structures therein. In a preferred embodiment the threshold is the power of one of the other candidates multiplied by a constant.
  • the candidate optic nerve head may be classified as the optic nerve head area if the, optionally weighted, power of the optic nerve head area is at least k times higher than the, optionally weighted, power of at least one of the other candidate optic nerve head areas, such as k being in the range of from 1.01 to 10, such as k being in the range of from 1.1 to 8, such as k being in the range of from 1.2 to 6, such as k being in the range of from 1.3 to 5, such as k being in the range of from 1.4 to 4, such as k being in the range of from 1.5 to 3, such as k being in the range of from 1.6 to 2.5, such as k being in the range of from 1.7 to 2.0; otherwise the absence of the optic nerve head in the image is assessed.
  • the at least one of the other candidate optic nerve head areas may be selected as the candidate optic nerve head area being ranked as No. 2, such as the candidate optic nerve head area being ranked as No. 3, such as the candidate optic nerve head area being ranked as No. 4, such as the candidate optic nerve head area being ranked as No. 5, such as the candidate optic nerve head area being ranked as No. 6, such as the candidate optic nerve head area being ranked as No. 7, such as the candidate optic nerve head area being ranked as No. 8, such as the candidate optic nerve head area being ranked as No. 9, such as the candidate optic nerve head area being ranked as No. 10, such as the candidate optic nerve head area being ranked as No. 11, such as the candidate optic nerve head area being ranked as No. 12, such as the candidate optic nerve head area being ranked as No. 13, such as the candidate optic nerve head area being ranked as No. 14, such as the candidate optic nerve head area being ranked as No. 15, or combinations of these.
  • the absence of the optic nerve head in the image is assessed, whereby it is concluded that the image is of a region of the fundus not including the optic nerve head, or that the optic nerve head is not visible, so that it does not disturb other processing of the image.
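A minimal sketch of this classification rule, with k = 1.7 chosen from the ranges listed above purely for illustration:

```cpp
// The selected candidate is accepted as the optic nerve head only if its
// (optionally weighted) power is at least k times the power of another,
// lower-ranked candidate; otherwise the optic nerve head is assessed as
// absent from the image.
bool acceptAsOpticNerveHead(double selectedPower, double referencePower,
                            double k = 1.7)
{
    // Dynamic threshold: the power of one of the other candidates times k.
    return selectedPower >= k * referencePower;
}
```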
  • the automatic fundus coordinate system setting procedure includes three procedures, namely an optic disc detecting procedure, a fovea detecting procedure and a fundus coordinate system setting procedure.
  • fovea is used synonymously with the term macula lutea.
  • the present invention relates to a method for establishing a coordinate system on an image of the ocular fundus, comprising
  • assessing the presence of the fovea region arranging a coordinate system having one axis parallel with an axis through the optic nerve head and the fovea.
  • the optic nerve head area is preferably assessed by the method described above, since this method is capable of assessing not only the presence of the optic nerve head area when present, but also the absence of the optic nerve head area when absent. Thus, no false positive detection of the optic nerve head area is made, which would otherwise lead to wrongly applied coordinate systems.
  • the macula lutea is a region having a radius approximately equal to twice the diameter of the optic disc around the center of the central portion of the fundus.
  • the macula lutea is made up of cones arranged in a close arrangement and constitutes a region of visual acuity and colour sense.
  • the region corresponding to the macula lutea is darker than the region surrounding the macula lutea.
  • the macula lutea has an area of a certain size and a conical shape.
  • Coordinate axes are determined on the basis of the results of detection of the optic disc and the macula lutea.
  • the abscissa axis passes through the center of the optic disc and the macula lutea
  • the ordinate axis is perpendicular to the abscissa axis and passes through the center of the optic disc.
  • the distance between the macula lutea and the optic disc may be calculated in units of disc diameter.
  • the axes of the orthogonal coordinate system are inclined relative to the image. The procedure includes image data transformation to make the orthogonal coordinate system coincide with the image.
  • a curvilinear coordinate system established by combining these coordinate axes and a nerve fiber bundle distribution pattern corresponds to a fundus coordinate system.
  • coordinate axes are set such that the abscissa having reverse direction to the macula lutea is located at 0°, the upper part of the ordinate is located at 90°, the macula lutea side abscissa is located at 180°, and the lower part of the ordinate is located at 270°.
  • the angle of this coordinate axis is called the optic disc inner angle.
  • the coordinate system may be used for locating various structures and pathological conditions in relation to other structures, for example for locating lesions in relation to the fovea region, since this region represents the specific vision region, and lesions close to the fovea may affect the vision, whereas lesions distant from the fovea may be of less importance prognostically.
  • the present invention further relates to a method for establishing a coordinate system on an image of the ocular fundus, comprising
  • the present invention further relates to a method for grading lesions in a fundus image, comprising establishing a coordinate system by the method as described above, and grading the lesions with respect to distance to the fovea region.
  • the optic nerve head may be used for registering the images, and accordingly, the present invention relates to a method for registering at least two different fundus images of the same fundus, comprising assessing the presence or absence of the optic nerve head in said images by a method as defined above, and orienting the images with respect to the optic nerve head.
  • the invention also relates to a method for registering at least two different fundus images of the same fundus, comprising assessing the presence or absence of the optic nerve head in said images by a method as defined above, and orienting the images with respect to the optic nerve head.
  • the identification of the optic nerve head area may also aid in detecting the vessels, since all vessels in retina either start from the optic nerve head or end at the optic nerve head, that is both the arteriolar network and the venolar network of vessels appear as trees having their roots at the optic nerve head, and may be tracked starting at the optic nerve head.
  • detection of the optic nerve head and detection for the vessels may also be an iterative process, wherein the optic nerve head is detected giving rise to a detection of the vessels, and the detection of the vessels thereby gives rise to a reiteration of the detection of the optic nerve head and so forth, until a maximum of significance for both the optic nerve head and the vessels has been met.
  • the present invention further relates to a method for detecting vessels in a fundus image, comprising
  • a very important aspect of the invention is the detection of the optional presence of the optic nerve head area in a fundus image before detection of any lesions of the fundus.
  • Lesions of the retina normally embrace microaneurysms and exudates, which show up on fundus images as generally "dot shaped" (i.e. substantially circular) areas. It is of interest to distinguish between such microaneurysms and exudates, and further to distinguish them from other pathologies in the image, such as "cotton wool spots" and hemorrhages. If the optic nerve head area is present in the image it may give rise to errors when detecting lesions in the image. Therefore, the present invention further relates to a method for assessing the presence or absence of lesions in a fundus image, comprising
  • a region around the detected optic nerve head area is masked, such as a region having a dimension corresponding to at least 1.1 times the diameter of the estimated optic nerve head area, such as at least 1.3 times the diameter of the estimated optic nerve head area, such as at least 1.5 times the diameter of the estimated optic nerve head area is masked, such as a region corresponding to at least 1.7 times the diameter of the estimated optic nerve head area is masked, such as a region corresponding to at least 2.0 times the diameter of the estimated optic nerve head area is masked.
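A sketch of such a mask, assuming the detected ONH is described by its centre and radius; the factor 1.5 is one of the example values given above, and the pixel layout is an assumption.

```cpp
#include <cstddef>
#include <vector>

// Build a Boolean mask that flags every pixel within maskFactor times the
// estimated ONH radius of the detected centre; lesion candidates found in the
// flagged region are ignored.
std::vector<bool> maskOpticNerveHead(int rows, int cols,
                                     double cx, double cy, double onhRadius,
                                     double maskFactor = 1.5)
{
    std::vector<bool> mask(static_cast<std::size_t>(rows) * cols, false);
    const double r = maskFactor * onhRadius;
    const double r2 = r * r;
    for (int y = 0; y < rows; ++y)
        for (int x = 0; x < cols; ++x)
            if ((x - cx) * (x - cx) + (y - cy) * (y - cy) <= r2)
                mask[static_cast<std::size_t>(y) * cols + x] = true;  // excluded from lesion search
    return mask;
}
```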
  • the lesions may be detected by any suitable method known to the person skilled in the art.
  • Glaucoma causes the exfoliation of the nerve fiber layer, entailing the expansion of the recess of the optic disc on which the nerve fiber layer converges. It is known that, in most cases, a portion of the optic disc blood vessel extending in the vicinity of the optic disc edge is bent severely as the recess of the optic disc expands.
  • indicators for glaucoma may be detected by studying the nerve fiber layer and/or the vessels in the vicinity of the optic nerve head edge. Preferably both methods are applied.
  • the reflectance of the nerve fiber layer decreases with progressive exfoliation due to glaucoma or the like, and the tone of the image of the nerve fiber layer darkens.
  • Although specialists in glaucoma analysis are capable of identifying very delicate glaucomatous changes in the nerve fiber layer, such changes are generally difficult to detect.
  • exfoliation of the nerve fiber layer propagates along the fiber bundles; such exfoliation is called an optic nerve fiber bundle defect.
  • this embodiment is capable of introducing, into image processing, medical information including the probability of occurrence of glaucomatous scotomas in the visual field.
  • the information about the nerve fiber layer can be contrasted with information about the optic disc by projecting the information about the nerve fiber layer on the information about the optic disc inner angle θ. Accordingly, a comprehensive glaucoma analyzing system can be built, for example as described in US Patent 5,868,134, Sugiyama, et al.
  • the image may be converted into a blue and green image having sixty-four gradations.
  • the blue and green image corresponds to a common red-free projection photograph. Defects in the nerve fiber bundles are set off by contrast and the influence of the choroid blood vessels is insignificant in the blue and green image. Details of the image that will adversely affect the analysis are excluded.
  • the optic disc portion, the blood vessel portion and the periphery of the blood vessels may be extracted and excluded to use a retina portion for the following analysis.
  • the analysis of blood vessel curvature on the optic disc edge can be carried out by the embodiment in US Patent 5,868,134, Sugiyama, et al., wherein a blood vessel curvature VC(θ) with respect to the direction of the optic disc inner angle (θ), as described above for the co-ordinate system, is detected for each of the optic disc edge blood vessels.
  • the blood vessel curvature VC(θ) is a curvature with respect to a direction of the midpoint C.
  • the blood vessel curvatures VC(θ) determined are stored for the comprehensive analysis of glaucoma.
  • the blood vessel curvature may be determined by measuring the length ratio of the straight line between two arbitrary points on a blood vessel intersecting the optic nerve head edge to the length between the same points measured along the course of the vessel. If the ratio is close to 1, the vessel is straight; the closer the ratio approaches 0, the more bent the vessel.
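A minimal sketch of this curvature measure, assuming the course of the vessel between the two points is available as a polyline of pixel coordinates:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

struct Point { double x, y; };

// Ratio of the straight-line (chord) length between the two end points to the
// length measured along the course of the vessel: close to 1 for a straight
// vessel, approaching 0 for a strongly bent vessel.
double vesselStraightnessRatio(const std::vector<Point>& course)
{
    if (course.size() < 2) return 1.0;                        // degenerate course
    double chord = std::hypot(course.back().x - course.front().x,
                              course.back().y - course.front().y);
    double path = 0.0;
    for (std::size_t i = 1; i < course.size(); ++i)
        path += std::hypot(course[i].x - course[i - 1].x,
                           course[i].y - course[i - 1].y);
    return path > 0.0 ? chord / path : 1.0;
}
```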
  • the indicators are determined by assessing the inner and outer edge of the optic nerve head.
  • the present invention relates to a method for detecting indicators of glaucoma in a fundus image, comprising
  • the perimeter may be a generally known automatic perimeter, a flicker perimeter for glaucoma detection, a blue pyramid perimeter or a perimeter using contrast sensitivity measurement.
  • the invention further relates to a system for assessing the presence or absence of the optic nerve head in a fundus image.
  • the system according to the invention may be any system capable of conducting the method as described above as well as any combinations thereof within the scope of the invention. Accordingly, the system comprises
  • n is an integer ≥ 1,
  • a graphical user interface module may operate in conjunction with a display screen of a display monitor.
  • the graphical user interface may be implemented as part of the processing system to receive input data and commands from a conventional keyboard and mouse through an interface and display results on a display monitor.
  • many components of a conventional computer system have not been discussed such as address buffers, memory buffers, and other standard control circuits because these elements are well known in the art and a detailed description thereof is not necessary for understanding the present invention.
  • Pre-acquired image data can be fed directly into the processing system through a network interface and stored locally on a mass storage device and/or in a memory. Furthermore, image data may also be supplied over a network, through a portable mass storage medium such as a removable hard disk, optical disks, tape drives, or any other type of data transfer and/or storage devices which are known in the art.
  • a parallel computer platform having multiple processors is also a suitable hardware platform for use with a system according to the present invention.
  • Such a configuration may include, but not be limited to, parallel machines and workstations with multiple processors.
  • the processing system can be a single computer, or several computers can be connected through a communications network to create a logical processing system.
  • the present system allows the grader, that is, the person normally grading the images, to identify the optic nerve head area more rapidly and securely, if it is present in the image. Also, the present system allows an automatic detection of lesions and other pathologies of the retina without interference from the optic nerve head area, again as an aiding tool for the traditional grader.
  • the network may carry data signals including control or image adjustment signals by which the expert examining the images at the examining unit directly controls the image acquisition occurring at the recordation localisation, i.e. the acquisition unit.
  • command signals such as zoom magnification, steering adjustments, and wavelength of field illumination may be selectively varied remotely to achieve a desired imaging effect.
  • questionable tissue structures requiring greater magnification or a different perspective for their elucidation may be quickly resolved without ambiguity by varying such control parameters.
  • by switching illumination wavelengths views may be selectively taken to represent different layers of tissue, or to accentuate imaging of the vasculature and blood flow characteristics.
  • the control signals may include time varying signals to initiate stimulation with certain wavelengths of light, to initiate imaging at certain times after stimulation or delivery of dye or drugs, or other such precisely controlled imaging protocols.
  • the digital data signals for these operations may be interfaced to the ophthalmic equipment in a relatively straightforward fashion, provided such equipment already has initiating switches or internal digital circuitry for controlling the particular parameters involved, or is capable of readily adapting electric controls to such control parameters as system focus, illumination and the like.
  • the imaging and ophthalmic treatment instrumentation in this case will generally include a steering and stabilization system which maintains both instruments in alignment and stabilized on the structures appearing in the field of view.
  • the invention contemplates that the system control further includes image identification and correlation software which allows the ophthalmologist at site to identify particular positions in the retinal field of view, such as pinpointing particular vessels or tissue structures, and the image acquisition computer includes image recognition software which enables it to identify patterns in the video frames and correlate the identified position with each image frame as it is acquired at the acquisition site.
  • the image recognition software may lock onto a pattern of retinal vessels.
  • the invention further contemplates that the images provided by acquisition unit are processed for photogrammetric analysis of tissue features and optionally blood flow characteristics. This may be accomplished as follows. An image acquired at the recordation unit is sent to an examination unit, where it is displayed on the screen. As indicated schematically in the figure, such image may include a network of blood vessels having various diameters and lengths. These vessels include both arterial and venous capillaries constituting the blood supply and return network.
  • the workstation is equipped with a photogrammetric measurement program which, for example, may enable the technician to place a cursor on an imaged vessel and, by moving the cursor along the vessel while clicking, have the software automatically determine the width of the vessel and the subvessels to which it is connected, as well as the coordinates thereof.
  • the software for noting coordinates from the pixel positions and linking displayed features in a record, as well as submodules which determine vessel capacities and the like, are straightforward and readily built up from photogrammetric program techniques.
  • Work station protocols may also be implemented to automatically map the vasculature as described above, or to compare two images taken at historically different times and identify or annotate the changes which have occurred, highlighting for the operator features such as vessel erosion, tissue which has changed colour, or other differences.
  • a user graphical interface allows the specialist to type in diagnostic indications linked to the image, or to a particular feature appearing at a location in the image, so that the image or processed version of it becomes more useful.
  • the relative health of the vessel, its blood carrying capacity and the like may also be visually observed and noted.
  • This photogrammetric analysis allows a road map of the vasculature and its capacity to be compiled, together with annotations as to the extent of tissue health or disease apparent upon such inspection.
  • a very precise and well-annotated medical record may be readily compiled and may be compared to a previously taken view for detailed evidence of changes over a period of time, or may be compared, for example, to immediately preceding angiographic views in order to assess the actual degree of blood flow occurring therein.
  • the measurement entries at the examination unit become an annotated image record and are stored in the central library as part of the patient's record.
  • the present invention changes the dynamics of patient access to care, and the efficiency of delivery of ophthalmic expertise in a manner that solves an enormous current health care dilemma, namely, the obstacle to proper universal screening for diabetic retinopathy.
  • Possible initial locations or starting points (seed points) of the optic nerve head (ONH) are found by determining the local maximums in feature images.
  • the feature images are produced by considering the flux in the vessel tree, vertical filtering of the vessel tree, and the intensity and variance of the un-sharpened fundus image.
  • the ONH is of a circular shape.
  • a three dimensional exhaustive search is performed around the seed points, in order to determine the optimal position and diameter of a circle fitting the (possible) ONH. Utilizing rules regarding the structure of the vessel tree and the intensity and variance maximums of the un-sharpened image allows accepting or rejecting a possible ONH. If accepted, then the origin and diameter of the ONH is already known from the exhaustive search.
  • the fundus image in Figure 1 serves to illustrate the individual steps in the ONH detection throughout the current research report.
  • the C/C++ header that defines the input/output from the ONH detection library is shown below.
  • ROI detection is done in the mask image; for full images the mask might be NULL.
  • the image might be an RGB image or a multi-frame image (then it is assumed that the first frame (frame 0) contains the fundus image).
  • Uncertain vessels have values ]0;100[, and certain vessels have values > 100 (the actual width is given as 'pixel value minus 100').
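A small sketch of decoding this vessel-image convention (the handling of the value 100 itself is not stated above and is left out here):

```cpp
// Decode one pixel of the vessel input image: values strictly between 0 and
// 100 mark uncertain vessels; values above 100 mark certain vessels whose
// width is 'pixel value minus 100'.
struct VesselPixel {
    bool certainVessel = false;
    bool uncertainVessel = false;
    double width = 0.0;   // only meaningful for certain vessels
};

VesselPixel decodeVesselPixel(double value)
{
    VesselPixel p;
    if (value > 100.0) {
        p.certainVessel = true;
        p.width = value - 100.0;
    } else if (value > 0.0 && value < 100.0) {
        p.uncertainVessel = true;
    }
    return p;
}
```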
  • ESD expectedDiskDiameter Double value of the expected disk diameter (notice it is the diameter, NOT radius).
  • EDD Expected Disk Diameter
  • the EDD is found as the mean ONH diameter measured in a number of images. The measurement is done in pixels.
  • the system has predefined values for 30 and 45 degrees of fundus coverage.
  • the optic disc area is independent of age beyond an age of about 3 to 10 years.
  • the optic disc measurements vary according to the method applied.
  • Mean optic disc area of non-highly myopic Caucasians examined in various studies ranged between 2.1 mm² and about 2.8 mm².
  • the optic disc has a slightly vertically oval form with the vertical diameter being about 7% to 10% larger than the horizontal one.
  • the area interval between 2.1 and 2.8 mm² gives a horizontal diameter in the range of 1570 to 1813 µm.
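As a check of these figures, treating the disc as an ellipse whose vertical diameter is roughly 8-9 % larger than the horizontal one (within the 7-10 % range stated above) reproduces the quoted diameter interval:

```latex
A = \pi\,\frac{d_v}{2}\,\frac{d_h}{2}, \qquad d_v \approx 1.085\,d_h
\quad\Rightarrow\quad d_h = \sqrt{\frac{4A}{1.085\,\pi}}

A = 2.1\ \mathrm{mm}^2 \Rightarrow d_h \approx 1.57\ \mathrm{mm}, \qquad
A = 2.8\ \mathrm{mm}^2 \Rightarrow d_h \approx 1.81\ \mathrm{mm}
```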
  • For RGB images the gray scale intensity is calculated as the mean of the red and green channels; otherwise the input is assumed to be a gray scale image.
  • the first step is applied to suppress noise.
  • the size of the median filter equals 'medianKernelSize' (Table 1).
  • the ONH is normally more pronounced in the red channel when having poor image quality. Therefore it is also used when converting to a gray scale image in the second step.
  • the image is reduced in size mainly since the ONH is a large feature and experiments show no significant performance decrease.
  • the image is reduced so the EDD becomes 'RescaledEDD' pixels (Table 1).
  • the rescaled numbers of rows and columns become 'rr' and 'cc' (Table 1), respectively.
  • minFluxWidth = VesselThickness('meanWidthFractile'), where 'meanWidthFractile' is found in Table 1.
  • the endpoints of the thickest vessels initialize a search where the vessels are followed until the thickness drops below 'minFluxWidth' or the direction of the vessel changes too abruptly.
  • a counter at each node point in the vessel tree is increased every time the node is visited.
  • the flux seed points are defined as all the node points having a counter larger than or equal to the value of the largest count minus one (Figure 3).
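A hedged sketch of this seed selection, with an illustrative vessel-graph representation (the check on abrupt direction changes mentioned above is omitted here for brevity):

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Node of a (simplified) vessel graph.
struct VesselNode {
    std::vector<int> neighbours;  // indices of connected node points
    double width = 0.0;           // local vessel calibre
    int visits = 0;               // incremented every time a walk passes the node
};

// Walk from the endpoints of the thickest vessels while the width stays at or
// above minFluxWidth, counting visits per node; the flux seed points are the
// nodes whose count is within one of the largest count.
std::vector<int> fluxSeedPoints(std::vector<VesselNode>& graph,
                                const std::vector<int>& thickEndpoints,
                                double minFluxWidth)
{
    for (int start : thickEndpoints) {
        int prev = -1, cur = start;
        for (std::size_t steps = 0;
             cur >= 0 && graph[cur].width >= minFluxWidth && steps < graph.size();
             ++steps) {
            ++graph[cur].visits;
            int next = -1;
            for (int nb : graph[cur].neighbours)
                if (nb != prev) { next = nb; break; }  // simply follow the first onward branch
            prev = cur;
            cur = next;
        }
    }
    int maxCount = 0;
    for (const VesselNode& n : graph) maxCount = std::max(maxCount, n.visits);
    std::vector<int> seeds;
    for (int i = 0; i < static_cast<int>(graph.size()); ++i)
        if (graph[i].visits >= maxCount - 1) seeds.push_back(i);  // within one of the maximum
    return seeds;
}
```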
  • the large vessels near the ONH are characterized by being almost vertical. Therefore two feature images emphasizing the vertical vessels are used to guide the search for the ONH, figure 5.
  • the two images that are filtered are described in the two following sub-sections.
  • the two images do not generate identical seed points.
  • the MAT helps to minimize this problem, as the width has no influence after the MAT has been applied.
  • the result from the vessel-tracking algorithm is part of the input to the ONH algorithm.
  • the input is converted to a Boolean image where the certain vessels form the foreground.
  • the medial axis transform algorithm finds the mid-line of a structure.
  • the midpoints are defined by the center of the largest circle touching more than one point of the border (of the object).
  • the MAT is calculated from the Boolean vessel image, figure 4.
  • the ONH is often the brightest area in the image. Gaussian filtering of the pre-filtered image is used in order to exploit this observation.
  • the kernel size is given by 'blurKernelSize' (Table 1), figure 6.
  • the variance feature is undoubtedly one of the best features if the ONH is present in the FOV. But like any other single feature, this feature may also fail, especially when the contrast of the image is poor.
  • a variance filter, or, as in this case, a standard deviation filter, gives a "blocky" feature image.
  • a weighted filter solves this.
  • a low pass filter such as the Gaussian filter, can be applied afterwards. This could also have been achieved by using a weighted variance filter defined as the convolution of the original variance filter and the Gaussian filter.
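A sketch of this feature, assuming a box-window variance followed by a separable Gaussian smoothing pass; window size and sigma are illustrative.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Local variance in a (2*half+1)^2 box window; gives the "blocky" raw feature.
std::vector<float> localVariance(const std::vector<float>& img,
                                 int rows, int cols, int half)
{
    std::vector<float> out(img.size(), 0.0f);
    for (int r = 0; r < rows; ++r)
        for (int c = 0; c < cols; ++c) {
            double sum = 0.0, sum2 = 0.0;
            int n = 0;
            for (int dr = -half; dr <= half; ++dr)
                for (int dc = -half; dc <= half; ++dc) {
                    int rr = r + dr, cc = c + dc;
                    if (rr < 0 || rr >= rows || cc < 0 || cc >= cols) continue;
                    double v = img[static_cast<std::size_t>(rr) * cols + cc];
                    sum += v;
                    sum2 += v * v;
                    ++n;
                }
            double mean = sum / n;
            out[static_cast<std::size_t>(r) * cols + c] =
                static_cast<float>(sum2 / n - mean * mean);
        }
    return out;
}

// Separable Gaussian smoothing of the variance image; applying it afterwards
// is equivalent to a weighted variance filter (the variance filter convolved
// with the Gaussian), as noted above.
std::vector<float> gaussianSmooth(const std::vector<float>& img,
                                  int rows, int cols, double sigma)
{
    int half = static_cast<int>(std::ceil(3.0 * sigma));
    std::vector<double> kernel(2 * half + 1);
    double s = 0.0;
    for (int i = -half; i <= half; ++i) {
        kernel[i + half] = std::exp(-0.5 * i * i / (sigma * sigma));
        s += kernel[i + half];
    }
    for (double& w : kernel) w /= s;                        // normalize to unit sum

    std::vector<float> tmp(img.size()), out(img.size());
    for (int r = 0; r < rows; ++r)                          // horizontal pass
        for (int c = 0; c < cols; ++c) {
            double acc = 0.0;
            for (int i = -half; i <= half; ++i) {
                int cc = std::clamp(c + i, 0, cols - 1);    // replicate the border
                acc += kernel[i + half] * img[static_cast<std::size_t>(r) * cols + cc];
            }
            tmp[static_cast<std::size_t>(r) * cols + c] = static_cast<float>(acc);
        }
    for (int r = 0; r < rows; ++r)                          // vertical pass
        for (int c = 0; c < cols; ++c) {
            double acc = 0.0;
            for (int i = -half; i <= half; ++i) {
                int rr = std::clamp(r + i, 0, rows - 1);
                acc += kernel[i + half] * tmp[static_cast<std::size_t>(rr) * cols + c];
            }
            out[static_cast<std::size_t>(r) * cols + c] = static_cast<float>(acc);
        }
    return out;
}
```

The weighted variance feature image would then be obtained as gaussianSmooth(localVariance(img, rows, cols, half), rows, cols, sigma).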
  • 'max' and 'min' are the global maximum and global minimum, respectively.
  • 'val' is the value of the local maximum being tested.
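The acceptance rule itself is not reproduced above; purely as an assumption, a plausible form is that a local maximum is only kept if its value lies sufficiently high within the global dynamic range:

```cpp
// Assumed seed acceptance test (not given explicitly in the text): keep a
// local maximum 'val' only if it lies in the upper part of the global range
// [min, max]. The fraction is a hypothetical parameter.
bool acceptLocalMaximum(double val, double globalMin, double globalMax,
                        double fraction = 0.5)
{
    if (globalMax <= globalMin) return false;   // degenerate (flat) feature image
    return (val - globalMin) / (globalMax - globalMin) >= fraction;
}
```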
  • Some areas should not be included when searching for suitable ONH candidates.
  • the following section uses the gradient in order to find the optimal placement of the ONH candidate. Obviously, high gradients between the brighter ROI and the darker surrounding area should be avoided, and pixels near the ROI border are therefore excluded from the calculation.
  • the ONH is assumed to have a circular shape leaving the center and the radius as the only unknowns, in total, three degrees of freedom.
  • the gradient orthogonal to a circle is used as cost function to find the border of the ONH ( Figure 14).
  • Figure 14b is chosen in the current implementation.
  • the number of points on the circle is 'nCircleResolution' (Table 2).
  • a 3D exhaustive search is performed in order to find the maximal gradient.
  • the size of the search area around the seed point is '2*boxHalfSize+1' with step size 'boxStep' (Table 2).
  • the last search dimension is the radius of the circle.
  • the radius of the outer circle is 'gradientCircleEnlargement' (Table 2) times the radius of the inner circle (Figure 14).
  • The avoidance mask partially avoids the (very) high false gradients. This means that the gradient is "excluded" if the end point is placed outside the mask. Notice that "excluded" means that the gradient value is set to zero. This is necessary in order to avoid false high gradients and "wrong" circle powers. If the mean were used instead, the optimization would have a tendency to favor placement outside the mask.
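A sketch of the exhaustive search for a single seed point. The cost functor stands in for the gradient-based circle power sketched earlier (with the avoidance-mask rule applied inside it); the parameter defaults are illustrative rather than the values of Table 2.

```cpp
#include <functional>
#include <limits>

struct OnhCandidate { double cx = 0.0, cy = 0.0, radius = 0.0, power = 0.0; };

// Try every centre in a (2*boxHalfSize+1)^2 box around the seed point (with
// step boxStep) together with every radius in [minRadius, maxRadius], and keep
// the combination with the largest circle power.
OnhCandidate exhaustiveSearch(double seedX, double seedY,
                              double minRadius, double maxRadius, double radiusStep,
                              int boxHalfSize, int boxStep,
                              const std::function<double(double, double, double)>& circlePower)
{
    OnhCandidate best{seedX, seedY, minRadius,
                      -std::numeric_limits<double>::infinity()};
    for (int dy = -boxHalfSize; dy <= boxHalfSize; dy += boxStep)
        for (int dx = -boxHalfSize; dx <= boxHalfSize; dx += boxStep)
            for (double r = minRadius; r <= maxRadius; r += radiusStep) {
                double p = circlePower(seedX + dx, seedY + dy, r);
                if (p > best.power)
                    best = OnhCandidate{seedX + dx, seedY + dy, r, p};
            }
    return best;
}
```

For each seed point this returns one locally optimal candidate, which is then de-duplicated and sorted as described in the following paragraphs.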
  • Seed points may be placed so that their search areas overlap, which may result in the same optimal position (and therefore also the same optimal radius). These identical results are removed, thereby producing an array of unique locally optimal ONH candidates.
  • a descending sort with respect to circle power is performed on the array. (Figure 11 shows the optimized ONH candidates. The optimization was initialized from the seed points seen. The intensities of the circles are equivalent to the circle power assigned to each ONH candidate.)

Choose the right ONH among candidates
  • the vessel graph is used to define SVSs. Let two connected node points define a line. A confidence band is placed around this line. The width of the band is '2*confidenceLimit' (Table 3). Recursively, connected node points are included as long as they are inside the confidence band.
  • the included node points are searched to find the two node points farthest away.
  • the resulting two node points define the SVS.
  • the Euclidean distance between them defines the length of the SVS.
  • the SVS is only regarded as being a significant vessel segment if it fulfils one of two rules: depending on whether the caliber of the SVS is below or above a threshold, the segment must be longer than 'longLineLength' or 'minLineLength' (Table 3), respectively (it must either be thin and long or thick and short).
  • the vicinity of the ONH candidate is searched for vessel crossings. It is a good assumption that large vessels (arcades) vertically cross the top and bottom of the periphery of the ONH. However, this restriction is relaxed since the vessel tracking may not be capable of finding the arcades e.g. if the image quality is degraded. It is also a common problem that the vessel tracking is not capable of tracking all the way into the ONH but stops a distance away from it. The projection of the 3D fundus to a 2D image may also introduce "mysterious" changes in the direction of the vessels.
  • an enlarged ONH radius is used when searching for crossings between the ONH periphery and the vertical vessels.
  • the search radius is enlarged by a factor 'circleEnlargeFactor' (Table 3).
  • Arcades are assumed to have a caliber larger than or equal to 'minWidth', which is defined as
  • minWidth = VesselWidthDistribution('vesselFrac').
  • a "vertical" arcade is defined to have a caliber ⁇ 'minWidth' and must cross the top or bottom quadrant with an angle less than 'maxVesselAngle '(Table 3). The angle is calculated as the angle between the vessel segment and the vector going from the center of the ONH can- didate to the point on the periphery.
  • the variance feature is also a very good feature. This means that the likelihood of a correctly identified ONH containing no seed point from either of these features is negligible.
  • the ONH candidates are regarded as samples from the fundus image and are thereby used to detect the normal background "variation".
  • the ratio of circle powers between the best ONH candidate (see below for the best candidate) and the 'OnhCandidate2CompareWith'-th candidate should be above (or equal to) 'gradientPowerFactor' (Table 3).
  • 'OnhCandidate2CompareWith' is incremented by one if the center of the 'OnhCandidate2CompareWith'-th ONH candidate is inside the best candidate. This is done until the center is outside the best candidate or there are no more candidates.
  • Figure 16 shows the circle power for the example fundus image.
  • topB and bottomB are two Booleans that are true if an arcade is crossing the periphery of the ONH through the top and bottom quadrant, respectively.
  • the Booleans intMaxB and varMaxB are true if an intensity or a variance seed point, respectively, is present inside the ONH candidate.
  • adjusted circle power is calculated as:
  • two candidates are flagged, namely 1) the candidate having the highest adjusted power which does not have the center point masked out in the tangential vessel image and 2) the candidate having the highest adjusted power which has the center point masked out in the tangential vessel image.
  • the two flags are called 'optimalGradNr' and 'optimalMaskNr' (initially the two flags equal the number of the last candidate).
  • if 'optimalMaskNr' < 'optimalGradNr', which means that a more powerful adjusted ONH candidate exists, then it is tested whether the tangentially masked candidate should be chosen instead: the center of the masked candidate should be inside the periphery of the 'optimalGradNr' candidate and satisfy a visibility criterion.
  • if an ONH is accepted, the ONH detection library returns 'foundONH' as true together with the center and radius of the ONH; otherwise 'foundONH' is false.
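The circular search described in the list above can be illustrated with a small sketch. This is only a minimal illustration of the technique, not the library implementation: the sign convention of the circle power (inner minus outer intensity, so that a bright disc scores high), the default parameter values and the calling convention are assumptions, and the image, ROI mask, seed point and radius range are assumed to be supplied by the caller.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def circle_power(image, mask, cx, cy, radius, n_points, enlarge):
    """Sum of radial gradients along a circle: inner-circle intensities minus
    outer-circle intensities, so a bright disc on a darker background gives a
    high power. Gradients whose outer end point falls outside the mask are
    set to zero (avoidance mask)."""
    angles = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    inner_x = cx + radius * np.cos(angles)
    inner_y = cy + radius * np.sin(angles)
    outer_x = cx + enlarge * radius * np.cos(angles)
    outer_y = cy + enlarge * radius * np.sin(angles)
    inner = map_coordinates(image, [inner_y, inner_x], order=1, mode='nearest')
    outer = map_coordinates(image, [outer_y, outer_x], order=1, mode='nearest')
    inside = map_coordinates(mask.astype(float), [outer_y, outer_x],
                             order=0, mode='constant') > 0.5
    grad = (inner - outer) * inside          # excluded gradients contribute zero
    return grad.sum()

def search_onh_candidate(image, mask, seed, radii,
                         box_half_size=10, box_step=2,
                         n_circle_resolution=36, enlarge=1.3):
    """Exhaustive search over center position and radius around one seed point
    (seed given as (row, column)); returns the best (power, cx, cy, radius)."""
    image = image.astype(float)
    sy, sx = seed
    best = (-np.inf, sx, sy, radii[0])
    for cy in range(sy - box_half_size, sy + box_half_size + 1, box_step):
        for cx in range(sx - box_half_size, sx + box_half_size + 1, box_step):
            for r in radii:
                p = circle_power(image, mask, cx, cy, r,
                                 n_circle_resolution, enlarge)
                if p > best[0]:
                    best = (p, cx, cy, r)
    return best
```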


Abstract

The present invention relates to a method for assessing the presence or absence of the optic nerve head in images of the ocular fundus (the back of the eye) by a robust method and system capable of validly detecting the optic nerve head when present in an image and not detecting the optic nerve head when it is missing in an image, independent of illumination, presence of diseases or artefacts in the image. The invention includes detection of candidate optic nerve head areas, assigning a power to each area, ranking the candidate areas, selecting the highest ranking area and classifying the highest ranking area with respect to a threshold as the optic nerve head area or not.

Description

Detection of optic nerve head in a fundus image
The present invention relates to a method for assessing the presence or absence of the optic nerve head in images of the ocular fundus (the back of the eye), hereinafter referred to as the fundus.
Background
Diabetes is the leading cause of blindness in working age adults. It is a disease that, among its many symptoms, includes a progressive impairment of the peripheral vascular system. These changes in the vasculature of the retina cause progressive vision impairment and eventually complete loss of sight. The tragedy of diabetic retinopathy is that in the vast majority of cases, blindness is preventable by early diagnosis and treatment, but screening programs that could provide early detection are not widespread.
Promising techniques for early detection of diabetic retinopathy presently exist. Researchers have found that retinopathy is preceded by visibly detectable changes in blood flow through the retina. Diagnostic techniques now exist that grade and classify diabetic retinopathy, and together with a series of retinal images taken at different times, these provide a methodology for the early detection of degeneration. Various medical, surgical and dietary interventions may then prevent the disease from progressing to blindness.
Despite the existing techniques for preventing diabetic blindness, only a small fraction of the afflicted population receives timely and proper care, and significant barriers separate most patients from state-of-the art diabetes eye care. There are a limited number of ophthalmologists trained to evaluate retinopathy, and most are located in population centers. Many patients cannot afford the costs or the time for travel to a specialist. Additionally, cultural and language barriers often prevent elderly, rural and ethnic minority patients from seeking proper care. Moreover, because diabetes is a persistent disease and diabetic retinopathy is a degenerative disease, an afflicted patient requires lifelong disease management, including periodic examinations to monitor and record the condition of the retina, and sustained attention on the part of the patient to medical or behavioral guidelines. Such a sustained level of personal responsibility requires a high degree of motivation, and lifelong disease management can be a significant lifestyle burden. These factors increase the likelihood that the patient will, at least at some point, fail to receive proper disease management, often with catastrophic consequences.
Accordingly, it would be desirable to implement more widespread screening for retinal degeneration or pathology, and to positively address the financial, social and cultural barriers to implementation of such screening. It would also be desirable to improve the efficiency and quality of retinal evaluation.
Hence, a precise knowledge of both localisation and orientations of the structures of the fundus is important, including the localisation of the optic nerve head. Currently, examination of fundus images is carried out principally by a clinician examining each image "manually". This is not only very time-consuming, since even an experienced clinician can take several minutes to assess a single image, but is also prone to error since there can be inconsistencies between the way in which different clinicians assess a given image.
It is therefore desirable to provide ways of automating the process of the analysis of fundus images, using computerised image analysis, so as to provide at least preliminary screening information and also as an aid to diagnosis to assist the clinician in the analysis of difficult cases.
Next, it is generally desirable to provide a method of determining accurately, using computerised image analysis techniques, the position of both the optic nerve head (the point of exit of the optic nerve) and the fovea (the region at the centre of the retina, where the retina is most sensitive to light), as well as vessels of the fundus.
Summary of the invention
The present invention relates to a method for assessing the presence or absence of the optic nerve head in fundus images. In order to enable automatic detection of various structures in fundus images, a method is needed that reliably detects the optic nerve head in images actually containing the optic nerve head, and reliably does not detect the optic nerve head in images in which it is absent. Current methods may be able to detect the optic nerve head in many normal images, but the methods are not reliable when applied to images not containing the optic nerve. Furthermore, the method should be robust in the sense that it should be applicable to a wide variety of images independent of illumination, presence of symptoms of diseases and/or artefacts of the image.
Accordingly, the present invention provides a method, comprising
a) establishing n candidate optic nerve head area(s), wherein n is an integer > 1 ,
b) ranking the n selected candidate optic nerve head area(s) with respect to at least one validating criteria,
c) selecting the highest ranking candidate optic nerve head area fulfilling the vali- dating criteria,
d) classifying the candidate optic nerve head area selected in step c) with respect to a threshold as the optic nerve head area or not.
In a preferred embodiment the present invention provides a method, comprising
a) establishing at least one starting point representative for an optic nerve head,
b) for each starting point initiating a search for the best candidate optic nerve head area, assigning a power to the candidate optic nerve head area ,
c) selecting n best candidate optic nerve head area(s) with respect to the power assigned in step b), wherein n is an integer > 1 ,
d) ranking the n selected candidate optic nerve head area(s) with respect to at least one validating criteria,
e) selecting the highest ranking candidate optic nerve head area fulfilling the validating criteria, f) classifying the candidate optic nerve head area selected in step e) with respect to a threshold as the optic nerve head area or not.
In a preferred embodiment n is a small number, such as less than 15, more preferably less than 10, more preferably less than 5. Normally n is 2, 3, or 4, most preferably 4.
Furthermore, the invention relates to a system capable of conducting the method, such as a system for assessing the presence or absence of the optic nerve head in a fundus image, comprising
a) an algorithm for establishing n candidate optic nerve head area(s), wherein n is an integer > 1,
b) an algorithm for ranking the n selected candidate optic nerve head area(s) with respect to at least one validating criteria,
c) an algorithm for selecting the highest ranking candidate optic nerve head area fulfilling the validating criteria,
d) an algorithm for classifying the candidate optic nerve head area selected in c) with respect to a threshold as the optic nerve head area or not.
The method according to the invention may be applied in several procedures for identifying structures or indications of diseases or abnormal conditions.
In one aspect the invention relates to the use of the method of assessing the optic nerve head for establishing a coordinate system on an image of the ocular fundus, comprising
assessing the presence of the optic nerve head area by a method as defined above,
assessing the presence of the fovea region, arranging a coordinate system having one axis parallel with an axis through the optic nerve head and the fovea.
Such a coordinate system allows a precise location of other structures in the fundus image, thereby providing more exact diagnosis, for example by establishing a coordinate system by the method, and grading the lesions with respect to distance to the fovea region.
The localisation of for example lesions with respect to the fovea region may also be accomplished by another method according to the invention for establishing a coordinate system on an image of the ocular fundus, comprising
assessing the presence of the optic nerve head area by a method as defined above, assessing the optic nerve head diameter,
assessing the presence of the fovea region,
arranging a coordinate system based on distance from the fovea, such as number of optic nerve head diameters from fovea.
When recording fundus images, at least 2 images are normally recorded from each fundus, representing different regions of the fundus. In order to examine the fundus region properly, registering or mounting of the images in a continuous manner with respect to the structures in the image is required, such as for example by arranging the images so that the vessels correctly continue from one image to the next. Also, the optic nerve head may be used for registering the images, and accordingly, the present invention relates to a method for registering at least two different fundus images of the same fundus, comprising assessing the presence or absence of the optic nerve head in said images by a method as defined above, and orienting the images with respect to the optic nerve head.
Also, the detection of the optic nerve head may be used for detecting vessels in the image, optionally as an iterative method. Thus, the present invention relates to a method for detecting vessels in a fundus image, comprising a) estimating the localisation of the optic nerve head region by a method as defined above,
b) estimating vessels based on the localisation of the optic nerve head area,
c) optionally repeating steps a) and b) at least once.
An important aspect of the present invention is the application of the method in a method for assessing the presence or absence of lesions in a fundus image. In such an application the detection of the optic nerve head is used, for example, to mask the optic nerve head area in order to avoid false positive lesions, which are very likely to be detected in the neighbourhood of the optic nerve head. Thus, the invention also relates to a method for assessing the presence or absence of lesions in a fundus image, comprising
a) estimating the localisation of the optic nerve head area in the image if present, masking a region comprising the optic nerve head area, and
b) estimating the presence or absence of lesions in the remaining image.
Also, detection of the optic nerve head area may allow detection of indicators of diseases such as glaucoma. Accordingly, the invention relates to a method for detecting indicators of glaucoma in a fundus image, comprising
a) estimating the localisation of the optic nerve head area in the image if present, and
b) detecting indicators of glaucoma related to the optic nerve head area.
Drawings
Fig. 1 is a fundus image.
Fig. 2 is an unsharp filtered image of the fundus of Fig. 1.

Fig. 3 is an image of scoring of branching points in the vessel tree of the fundus of Fig. 1. Starting points are established at highest scoring branching points.
Fig. 4 is the fundus image of Fig. 1 wherein a filter enhancing sagital structures is applied. The vessels have all been denoted 1 pixel in width (Medial Axis Transformation - MAT). Starting points are established in maxima of the image.
Fig. 5 is the fundus image of Fig. 1 wherein a filter enhancing sagital structures is applied. Starting points are established in maxima of the image.
Fig. 6 is an image of the fundus in Fig. 1 , wherein intensity maximum has given rise to a starting point.
Fig. 7 is an image of the fundus in Fig. 1, wherein variance maximum of the image has given rise to a starting point.
Fig. 8 is an image of the fundus in Fig. 1 , wherein the vessels have been masked.
Fig. 9 is an image of the fundus in Fig. 1, wherein the position of all starting points detected by the various methods is shown.
Fig. 10 is an image showing excluding areas of the fundus of Fig. 1 based on tangential vessel exclusion.
Fig. 11 shows the optimal circles found from the starting points, and the power assigned to the circles.
Fig. 12 shows the circle being accepted as the optic nerve head.
Fig. 13 shows 4 different fundus images, wherein the optic nerve head is present in two of the images, and absent in the remaining images.
Figure 14. The gradient orthogonal to a circle is used as cost function to find the border of the ONH. In (a) the cost function is computed as the sum of individual gradients, where the gradient is computed as the difference between the intensity value at the end- and start-point. In (b) the cost is computed as the difference between the sum of the intensity at outer- and inner-circle-points. (a) and (b) are equal (as long as the computations are linear).
Figure 15. The figure shows the correspondence between the circle power (sums of the intensities at the 'nCircleResolution' circle-points) and the radial distance from the centre of the ONH. The optimal radius is found at the highest gradient.
Figure 16. The circle power seen visually in Figure 11 here plotted as a graph.
Figure 17. In general, no vessels near the ONH will be tangential to the ONH.
Figure 18. The schematic drawing shows the tangential vessel exclusion algorithm. An exclusion area is drawn around all significant vessels in the fundus image. The ONH is more likely to be found outside the tangential exclusion areas.
Figure 19. The figure depicts how the neighborhood around an initial vessel segment is expanded inside the confidence band. The endpoints of the straight vessel segment are the expanded node points that are farthest away from each other.
Figure 20. The figure shows an ONH candidate and the crossing of an arcade (vessel segment).
Definitions
Image: The term image is used to describe a representation of the region to be examined, i.e. the term image includes 1-dimensional representations, 2-dimensional representations and 3-dimensional representations as well as n-dimensional ones. Thus, the term image includes a volume of the region, a matrix of the region as well as an array of information of the region.
Representative for an optic nerve head: The term representative means that the starting point may represent a point or an area of the optic nerve head.
ROI: Region of interest.

Visibility: The term visibility is used in the normal meaning of the word, i.e. how visible a lesion or a structure of the fundus region is compared to background and other structures/lesions.
Optic nerve head (ONH): The term is used in its normal anatomical meaning, i.e. the area in the fundus of the eye where the optic nerve enters the retina. Synonyms for the area are, for example, the "blind" spot, the papilla, or the optic disk.
Fovea: The term is used in its normal anatomical meaning, i.e. the spot in the retina having a great concentration of cones giving rise to sharp vision. Fovea and the term "macula lutea" are used as synonyms. The fovea also has an increased pigmentation.
Red-green-blue image: The term relates to the image having the red channel, the green channel and the blue channel, also called the RGB image.
Starting point: The term describes a point or area for starting the search for candidate optic nerve head areas. The term starting point is thus not limited to a mathematical point, such as not limited to a pixel, but merely denotes a localisation for starting a search.
Detailed description of the invention
Images
The images of the present invention may be any sort of images and presentations of the fundus. In one embodiment the image is presented on a medium selected from slides, paper photos or digital photos. However, the image may be any other kind of representation, such as a presentation on an array of elements, for example a CCD.
The image may be a grey-toned image or a colour image; in a preferred embodiment the image is a colour image.
In a preferred embodiment the green and/or the red channel is used for assessing the presence or absence of the optic nerve head area, and more preferably an average of the green and the red channel is used.

Establishing candidate optic nerve head areas
The candidate optic nerve head area(s) may be detected by any suitable method, for example by filtering, by template matching, by establishing starting points and growing regions from said starting points, and/or by other methods of searching for candidate areas, and/or combinations thereof. In a preferred embodiment the candidate optic nerve head area(s) are detected by establishing starting points, and from the starting points searching for candidate areas.
Establishing starting points
The starting points may be established by a variety of suitable methods and of combinations of such methods. In a preferred embodiment the establishment of starting points is conducted by applying the same approach as the present inventors have experienced that the human eye uses for identifying the optic nerve head when examining the fundus image manually. This approach is principally a combination of following the vessels in the image and looking for a bright area.
The image may be filtered and/or blurred before establishing or as a part of establishing starting points for the method. For example the low frequencies of the image may be removed before establishing starting points. Also, the image may be unsharp filtered, for example by median or mean filtering the image and subtracting the filtered result from the image.
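As an illustration of such unsharp filtering, the sketch below subtracts a median-filtered (or mean-filtered) version of the image from the image itself, leaving only the higher spatial frequencies; the kernel size is a placeholder, not a value taken from the description.

```python
import numpy as np
from scipy.ndimage import median_filter, uniform_filter

def unsharp(image, kernel_size=31, use_median=True):
    """Unsharp filtering: subtract a heavily smoothed version of the image so
    that only the higher spatial frequencies remain."""
    image = image.astype(float)
    background = (median_filter(image, size=kernel_size) if use_median
                  else uniform_filter(image, size=kernel_size))
    return image - background
```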
Independent of whether the image is filtered or not the starting points may be established as extrema of the image. Preferably the image is however a filtered image, wherein the filtering may be linear and/or non-linear.
In one embodiment the filtering method is a filtering method using templates, wherein the template may exhibit any suitable geometry for identifying the optic nerve head. Example templates are circular templates having a diameter of the expected optic nerve head ± some percent, for example ± 30 %. It is within the scope of the invention that the image may be filtered with one or more filters before establishing starting points, or as a part of the step of establishing starting points. Thus, in one embodiment of the invention starting points are established by combining two or more filters.
The extrema may thus be identified individually by one or more of several methods, such as the following:
Searching for the part of the vessels where no further branching points are detected, that is establishing at least one extremum in the image based on vessel branching points, preferably establishing at least one maximum in the image based on vessel branching points.
Blood vessels branch at the optic disc in directions substantially perpendicular to a transverse direction, for example a direction from the optic disc toward the macula lutea. The blood vessels branching perpendicularly at the optic nerve head area may be designated longitudinal blood vessels or sagitally oriented vessels, and the optic nerve head may be detected by searching for the longitudinal vessels. Thus, one method may be establishing at least one extremum in the image based on a filter enhancing sagital structures, preferably establishing at least one maximum in the image based on a filter enhancing sagital structures. The sagital filtering may be conducted on the vessels present in the image without any transformation. However, the sagital filtering may additionally or only be conducted on a thinned vessel image, for example wherein the vessels are all thinned to one pixel.
Tracking vessels
Various methods are known by which the vascular system may be isolated from the rest of the image context and skeletonized, i.e. the network of the vessels identified.
One method for tracking vessels may be to extract linear components that may be regarded as those of the blood vessels. That is, groups of pixels forming bright lines on the dark background are considered to be images of the blood vessels. Another method for tracking vessels is a method wherein use is made of the fact that the vessels are linear in a local neighbourhood, wherein different filter matrices have different orientations. The localisation and orientation of such line elements may be determined using a template matching approach (sometimes referred to as match filters).
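A minimal sketch of such a template matching (match filter) approach is shown below: the image is convolved with a small bank of oriented, zero-mean line kernels and the strongest response per pixel is kept. The kernel length, width and number of orientations are illustrative assumptions; for dark vessels on a bright background the image (or the kernel) should be negated.

```python
import numpy as np
from scipy.ndimage import convolve

def oriented_line_kernel(length=15, width=3, angle=0.0):
    """Zero-mean line kernel of the given length/width, rotated by 'angle'
    radians, so that flat regions give no response."""
    half = length // 2
    yy, xx = np.mgrid[-half:half + 1, -half:half + 1]
    u = xx * np.cos(angle) + yy * np.sin(angle)      # coordinate along the line
    v = -xx * np.sin(angle) + yy * np.cos(angle)     # coordinate across the line
    kernel = ((np.abs(v) <= width / 2.0) & (np.abs(u) <= half)).astype(float)
    return kernel - kernel.mean()

def line_response(image, n_angles=12):
    """Maximum response over a bank of oriented line filters (match filters);
    bright lines give high responses, so negate the image for dark vessels."""
    image = image.astype(float)
    responses = [convolve(image, oriented_line_kernel(angle=a), mode='nearest')
                 for a in np.linspace(0, np.pi, n_angles, endpoint=False)]
    return np.max(responses, axis=0)
```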
Other methods for tracking vessels known to the person skilled in the art may be found in
Subhasis Chaudhuri et al, "Detection of Blood Vessels in Retinal Images Using Two- Dimensional Matched Filters", IEEE Transactions on Medical Imaging, Vol. 8, No. 3, September 1989.
Tolias, Y. A. et al., "A fuzzy vessel tracking algorithm for retinal images based on fuzzy clustering", IEEE Transactions on Medical Imaging, Vol. 17, No. 2, April 1998, pages 263-273, ISSN: 0278-0062.
Akita et al., "A computer method of understanding ocular fundus images", Pattern Recognition, Vol. 15, No. 6, 1982, pages 431-443, ISSN: 0031-3203, chapter 4.
A preferred method for tracking vessels is by tracking individual vessels from starting points representative for vessels, and iteratively grow the vessel network of the retina. A preferred embodiment hereof is described in a co-pending PCT patent application entitled "Vessel tracking" of RETINALYZE A/S.
In some of the embodiments according to the invention, it is preferred that the estimation of candidate optic nerve head areas is adjusted with respect to vessels appearing in the image. By adjusted is meant either that an iterative estimation of optic nerve head and vessels is conducted, wherein for each iteration, the significance of the localisation of both increases towards a maximum, or that knowledge of the anatomical localisation of vessels adjacent the optic nerve head is used for locating and/or validating the position of the optic nerve head. For many of these embodiments it is even more preferred that the estimation of candidate optic nerve head areas is preceded by detection of vessels in the image. Having identified the blood vessels in the image, it is desirable to be able to distinguish between veins and arteries among the blood vessels. This can be important, for example in the diagnosis of venous beading and focal arteriolar narrowing.
The vascular system observed in the ocular fundus images is by nature a 2-dimensional projection of a 3-dimensional structure. It is quite difficult in principle to distinguish veins from arteries solely by looking at isolated vessel segments. However, it has been discovered that effective separation can be achieved by making use of the fact that, individually, the artery structure and the vein structure are each a perfect tree (i.e., there is one unique path along the vessels from the heart to each capillary and back).
On the retina, the artery and vein structures are each surface filling, so that all tissue is either supplied or drained by specific arteries or veins, respectively.
A method for distinguishing veins from arteries is described in WO 00/65982 to Torsana Diabetes Diagnostic A/S and is based on the realisation that crossings of vessel segments are, for practical purposes, always between a vein and an artery (i.e. crossings between arteries and arteries or between veins and veins are, for practical purposes, non-existent).
It is preferred that independent arterioles and venoles have been identified before establishing extrema based on vessel branching points as discussed above.
Brightness
The optic nerve head will normally be one of the brightest areas in the image, or at least locally the brightest area. Thus, a method may be establishing at least one intensity extremum in the image, preferably at least one intensity maximum. In this regard a cause of failure is, for example, the quite common shadow in the region of the optic nerve head, where the shadow is caused by the nose of the person having his or her fundus examined. Therefore, in a preferred embodiment at least one local intensity maximum is established. The extrema may be established on any image function, such as wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel or any combinations thereof.
Instead of using intensity or in addition to using intensity the method may include establishing at least one variance extremum in the image, preferably establishing at least one variance maximum in the image. For the same reasons as described with respect to the intensity at least one local variance maximum is established. The extrema may be established on any image function, such as wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel or any combinations thereof.
In a preferred embodiment the variance extremum is a weighted variance maximum, or a local variance maximum, more preferably a local weighted variance maximum.
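A local weighted variance, here in the form of a Gaussian-weighted local standard deviation as discussed above, can be sketched as follows; the sigma is a placeholder value and the seed-point selection in the comment is only one possible use.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def weighted_local_std(image, sigma=8.0):
    """Gaussian-weighted local standard deviation: avoids the 'blocky' look of
    a plain box-window variance filter. Uses var = E[x^2] - E[x]^2 with
    Gaussian-weighted local means."""
    image = image.astype(float)
    mean = gaussian_filter(image, sigma)
    mean_sq = gaussian_filter(image * image, sigma)
    var = np.clip(mean_sq - mean * mean, 0.0, None)
    return np.sqrt(var)

# A starting point could then be placed at the (possibly local) maximum, e.g.:
# seed = np.unravel_index(np.argmax(weighted_local_std(green_channel)),
#                         green_channel.shape)
```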
Another method for establishing starting points may be random establishment of starting points, wherein the ultimate random establishment is establishing a starting point in substantially each pixel of the image. Of course a random establishment may be combined with any of the methods discussed above.
In yet a further embodiment the starting points may be established as grid points, such as evenly distributed or unevenly distributed grid points. Again this method may be combined with any of the methods of establishing extrema in the image and/or random establishment.
In a preferred method, starting points are established by more than one of the methods described in order to increase the probability of assessing the correct localisation of the optic nerve head when present in the image, also with respect to images having less optimal illumination or presenting other forms of less optimal image quality, a problem that increases when fundus images are recorded decentrally and by less experienced staff than is the case at specialised hospital departments.
Thus, in a preferred embodiment starting points are established by at least two of the steps or methods described above, such as by at least three of the steps or methods described above, such as by at least four of the steps or methods described above, such as at least five of the steps or methods described above.

Search for best candidate
From the starting points established, a search in a local region around the starting point for a best candidate optic nerve head is initiated. The search may be for a geometric shape representative for the optic nerve head. For example the best candidate optic nerve head area may be represented by the periphery of the area, such as a closed curve, such as a circle, an ellipse, a snake or a polygon, wherein the latter represents an irregular geometrical form. In another embodiment the best candidate optic nerve head area is represented as an open curve, such as a circle, an ellipse, a snake or a polygon. By the latter method it is possible to detect the optic nerve head although a part of the optic nerve head is outside the image.
In yet another embodiment the best candidate optic nerve head area may be represented by the area as such, for example represented as a closed area, such as a circle, an ellipse, a snake or a polygon.
In a preferred embodiment the search initiated in step b) is a search for a centre of a best matching circle, wherein the centre is positioned in a search region of a predetermined size established around each starting point. In this embodiment the centre of the best matching circle is not necessarily the starting point, but rather the centre is positioned within the search region around the starting point. Normally the search for best matching circles is conducted by searching for a variety of radii in the search region. Typically the radius of the best matching circle is in the range of ± 100 % of the expected diameter of the optic nerve head, such as in the range of ± 90 % of the expected diameter of the optic nerve head, such as in the range of ± 80 % of the expected diameter of the optic nerve head, such as in the range of ± 70 % of the expected diameter of the optic nerve head, such as in the range of ± 60 % of the expected diameter of the optic nerve head, such as in the range of ± 50 % of the expected diameter of the optic nerve head, such as in the range of ± 40 % of the expected diameter of the optic nerve head, such as in the range of ± 35 % of the expected diameter of the optic nerve head, such as in the range of ± 30 % of the expected diameter of the optic nerve head, such as in the range of ± 25 % of the expected diameter of the optic nerve head, such as in the range of ± 20 % of the expected diameter of the optic nerve head, such as in the range of ± 15 % of the expected diameter of the optic nerve head, such as in the range of ± 10 % of the expected diameter of the optic nerve head.
The expected optic nerve head diameter may be predetermined as an absolute figure, however in order to apply the method to a variety of images taken with different resolutions etc. it is more appropriate to estimate the expected optic nerve head diameter in relation to other structures in the image. For example the expected optic nerve head diameter is estimated from the caliber of vessels in the image or simply from the size of the image. Also, the expected nerve head diameter may be estimated from the camera magnification, or from the height of the image. Furthermore, the optic nerve head diameter may be established as a standard for the specific camera setup, by measuring the optic nerve head diameter in a number of different images.
Furthermore, the search region used for finding the best candidates may have any geometric form, such as a circular region, or a rectangular region.
Assigning a power
For each of the best candidate optic nerve head areas a power is assigned to the candidate. The power is preferably a value representative for the visibility of the candidate in the image.
In a preferred embodiment the power is calculated as a measure of the candidate optic nerve head area edge, such as wherein said measure is selected from the summarized gradient, the summarized variance and/or the mean of the summarized variance, Laplace filtering, the curvature, the intensity, the skewness, the kurtosis, a measure derived from a Fourier transformation, a measure derived from a co-occurrence matrix, and a measure derived from the fractal dimension. In a preferred embodiment the power is calculated as the gradient of the candidate optic nerve head area edge.
Furthermore, it is preferred that the power calculated is weighted with respect to other known structures present in the image in order to reduce the risk of inadvertent assignment of a too high or too low power to the candidate optic nerve head area. Known structures in the image may for example be the vessels present in the image, in particular the vessels in the local region comprising the candidate optic nerve head area. Another known structure is the edge of the image, since the border between the area outside the region of interest (ROI) and the area inside the image may represent a high gradient.
Selecting
Normally several candidate optic nerve head regions have been identified in the image, and the selection step is a step for selecting the most probable optic nerve head area(s) among the various candidates for further validation. Therefore the n best candidate optic nerve head area(s) are selected with respect to the power assigned as described above, i.e. the n candidate nerve head areas are the areas having the n highest powers. n is an integer ≥ 1, such as an integer in the range of from 1-100, such as an integer in the range of from 1-50, such as an integer in the range of from 1-25, such as an integer in the range of from 1-10, such as an integer in the range of from 2-25, such as an integer in the range of from 2-10, such as an integer in the range of from 3-25, such as an integer in the range of from 3-10, such as n being 1, 2, 3, 4, 5 or 6.
Validating criteria
When n ≥ 2, the selected candidate optic nerve head areas are then ranked with respect to at least one validating criterion; said validating criteria may be related to the anatomical structures of the fundus region, i.e. related to the vessels as well as to the brightness of the optic nerve head area. The validation step is conducted in order to increase the probability of the candidate optic nerve head area being the "true" optic nerve head. Although a power has been assigned previously, the power may be biased by local factors in the image, and therefore not be able to rank the candidates properly.
Accordingly, the candidate optic nerve head area is preferably ranked with respect to the presence or absence of at least one of the following validating criteria:
any substantial sagital vessels detected extending out superior and/or inferior from the candidate optic nerve head area,

establishing at least one intensity extremum in the image, preferably at least one intensity maximum, more preferably at least one local intensity maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof, and/or
establishing at least one variance extremum in the image, preferably establishing at least one variance maximum in the image, more preferably at least one local variance maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof,
no vessels tangential to the candidate optic nerve head area.
The more of the criteria a candidate fulfils, the higher its ranking. In a preferred embodiment a candidate cannot be ranked as a candidate if a minimum of the criteria is not fulfilled. Therefore, it may be preferred that the candidate optic nerve head area is ranked so that at least one of the criteria related to the vessels is fulfilled, and at least one of the other criteria is fulfilled, for example as ranking with respect to the presence or absence of the following validating criteria:
any substantial sagital vessels detected extending out superior and/or inferior from the candidate optic nerve head area, and
establishing at least one intensity extremum in the image, preferably at least one intensity maximum, more preferably at least one local intensity maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof, and/or
establishing at least one variance extremum in the image, preferably establishing at least one variance maximum in the image, more preferably at least one local variance maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof.
A more reliable method is obtained when the candidate optic nerve head area is ranked with respect to the presence or absence of the following validating criteria:
substantial sagital vessels detected extending out above and out below from the candidate optic nerve head area, and
establishing at least one intensity extremum in the image, preferably at least one intensity maximum, more preferably at least one local intensity maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof, and
establishing at least one variance extremum in the image, preferably establishing at least one variance maximum in the image, more preferably at least one local variance maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof.
A further criterion may be added to the criteria discussed above or substituted for one of the criteria discussed above. An example of such a further criterion may be
no vessels tangential to the candidate optic nerve head area.
By this criterion, all candidate optic nerve head areas localised in a region along vessels in the image are rejected as a candidate, since there are no vessels tangential to the optic nerve head.
Generally, a candidate optic nerve head area not fulfilling at least one of the criteria is rejected as a candidate optic nerve head area, whereby it is not ranked but simply rejected. For a more reliable method, the candidate optic nerve head area should preferably fulfil at least two of the criteria, otherwise the candidate is rejected as a candidate optic nerve head area. The criteria mentioned above may be applied to the candidate optic nerve head area by weighting the power of the candidate optic nerve head areas with the criteria fulfilled by each candidate. Thereby a candidate area having an extraordinary high power may be ranked highest, although some of the other criteria are not fulfilled.
In a very preferred embodiment a candidate optic nerve head area is rejected if not fulfilling one of the following combinations of criteria (a sketch of such a check follows the list below):
substantial sagital vessels detected extending out superior and inferior from the candidate optic nerve head area, and
establishing at least one intensity extremum in the image, preferably at least one intensity maximum, more preferably at least one local intensity maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof, or
establishing at least one variance extremum in the image, preferably establishing at least one variance maximum in the image, more preferably at least one local variance maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof,
or
substantial sagital vessels detected extending out superior or inferior from the candidate optic nerve head area, and
establishing at least one intensity extremum in the image, preferably at least one intensity maximum, more preferably at least one local intensity maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof, and
establishing at least one variance extremum in the image, preferably establishing at least one variance maximum in the image, more preferably at least one local variance maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof.
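A minimal sketch of this rejection rule, under the assumption that the sagital-vessel, intensity and variance criteria have already been evaluated as booleans for each candidate, could look as follows; the exact power weighting used by the described method is not reproduced here.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    power: float
    sagittal_above: bool   # substantial sagital vessel extending superiorly
    sagittal_below: bool   # substantial sagital vessel extending inferiorly
    intensity_seed: bool   # intensity maximum inside the candidate
    variance_seed: bool    # variance maximum inside the candidate

def passes_validation(c: Candidate) -> bool:
    """Accept if vessels cross both above and below together with at least one
    brightness criterion, or vessels cross on one side together with both
    brightness criteria."""
    both_vessels = c.sagittal_above and c.sagittal_below
    one_vessel = c.sagittal_above or c.sagittal_below
    any_brightness = c.intensity_seed or c.variance_seed
    both_brightness = c.intensity_seed and c.variance_seed
    return (both_vessels and any_brightness) or (one_vessel and both_brightness)

def rank_candidates(candidates):
    """Discard candidates failing validation, then rank the rest by power."""
    valid = [c for c in candidates if passes_validation(c)]
    return sorted(valid, key=lambda c: c.power, reverse=True)
```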
By the ranking it may be possible to obtain a candidate having the highest ranking although its power is not the highest. After the ranking step the candidate optic nerve head areas being ranked and not rejected are subjected to a selection step, wherein the highest ranking candidate optic nerve head area fulfilling the validating criteria is selected for evaluating the probability that the candidate in fact represents the "true" optic nerve head area or not.
This is conducted by classifying the candidate optic nerve head area selected above with respect to a threshold as the optic nerve head area or not. The threshold may be a predetermined absolute threshold, but in a preferred embodiment the threshold is a dynamic threshold relating to the image and the structures therein. In a preferred embodiment the threshold is the power of one of the other candidates multiplied with a constant. Thus, the candidate optic nerve head may be classified as the optic nerve head area if the, optionally weighted, power of the optic nerve head area is at least k times higher than the, optionally weighted, power of at least one of the other candidate optic nerve head areas, such as k being in the range of from 1.01 to 10, such as k being in the range of from 1.1 to 8, such as k being in the range of from 1.2 to 6, such as k being in the range of from 1.3 to 5, such as k being in the range of from 1.4 to 4, such as k being in the range of from 1.5 to 3, such as k being in the range of from 1.6 to 2.5, such as k being in the range of from 1.7 to 2.0; otherwise the absence of the optic nerve head in the image is assessed.
The at least one of the other candidate optic nerve head areas may be selected as the candidate optic nerve head area being ranked as No. 2, such as the candidate optic nerve head area being ranked as No. 3, such as the candidate optic nerve head area being ranked as No. 4, such as the candidate optic nerve head area being ranked as No. 5, such as the candidate optic nerve head area being ranked as No. 6, such as the candidate optic nerve head area being ranked as No. 7, such as the candidate optic nerve head area being ranked as No. 8, such as the candidate optic nerve head area being ranked as No. 9, such as the candidate optic nerve head area being ranked as No. 10, such as the candidate optic nerve head area being ranked as No. 11, such as the candidate optic nerve head area being ranked as No. 12, such as the candidate optic nerve head area being ranked as No. 13, such as the candidate optic nerve head area being ranked as No. 14, such as the candidate optic nerve head area being ranked as No. 15, or combinations of these.
In case the power of the candidate optic nerve head area classified is below this threshold, the absence of the optic nerve head in the image is assessed, whereby the image is concluded to be of a region of the fundus not including the optic nerve head, or to show an optic nerve head that is not visible, so that it will not disturb other processing of the image.
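The classification step can be sketched as a simple ratio test between the power of the best candidate and that of one of the other ranked candidates; the values of k and of the comparison index below are placeholders, and the behaviour when no runner-up exists is an assumption.

```python
def classify_best_candidate(ranked_powers, k=1.8, compare_with=1):
    """Return True (optic nerve head found) if the best candidate's power is at
    least k times the power of the 'compare_with'-th ranked candidate;
    otherwise the optic nerve head is assessed as absent."""
    if not ranked_powers:
        return False
    if len(ranked_powers) <= compare_with:
        return True   # assumption: accept when there is no runner-up to compare with
    return ranked_powers[0] >= k * ranked_powers[compare_with]
```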
Applications
In the following examples of various applications of the method according to the invention are discussed.
Automatic Fundus Coordinate System Setting Procedure
The automatic fundus coordinate system setting procedure includes three procedures, namely an optic disc detecting procedure, a fovea detecting procedure and a fundus coordinate system setting procedure. In the present context the term fovea is used synonymously with the term macula lutea.
Accordingly, the present invention relates to a method for establishing a coordinate system on an image of the ocular fundus, comprising
assessing the presence of the optic nerve head area,
assessing the presence of the fovea region, arranging a coordinate system having one axis parallel with an axis through the optic nerve head and the fovea.
The optic nerve head area is preferably assessed by the method described above, since this method is capable of assessing not only the presence of the optic nerve head area when present, but also the absence of the optic nerve head area when absent. Thus, no false positive detection of the optic nerve head areas is conducted which would lead to wrongly applied coordinate systems.
The macula lutea is a region having a radius approximately equal to twice the diameter of the optic disc around the center of the central portion of the fundus. The macula lutea is made up of cones arranged in a close arrangement and constitutes a region of visual acuity and colour sense. The region corresponding to the macula lutea is darker than the region surrounding the macula lutea. The macula lutea has an area of a certain size and a conical shape. These features may be used for detecting the macula lutea. The macula lutea may be detected as described in United States Patent 5,868,134 (Sugiyama) or WO 00/65982 (Torsana Diabetes Diagnostics A/S).
Coordinate axes are determined on the basis of the results of detection of the optic disc and the macula lutea. In an orthogonal coordinate system, the abscissa axis passes the center of the optic disc and the macula lutea, and the ordinate axis is perpendicular to the abscissa axis and passes through the center of the optic disc. In this orthogonal coordinate system, the distance between the macula lutea and the optic disc may be calculated in units of disc diameter. In most cases, the axes of the orthogonal coordinate system are inclined relative to the image. The procedure includes image data transformation to make the orthogonal coordinate system coincide with the image.
A curvilinear coordinate system established by combining these coordinate axes and a nerve fiber bundle distribution pattern, corresponds to a fundus coordinate system. Based on the centre of the optic disc, coordinate axes are set such that the abscissa having reverse direction to the macula lutea is located at 0°, the upper part of the ordinate is located at 90°, the macula lutea side abscissa is located at 180°, and the lower part of the ordinate is located at 270°.
In the present specification, the angle of this coordinate axis is called the optic disc inner angle.
The coordinate system may be used for locating various structures and pathological condition in relation to other structures, for example for locating lesions in relation to the fovea region, since this region represents the specific vision region, and lesions close to the fovea may affect the vision, whereas lesions distanced from fovea may be of less importance prognostically.
Accordingly, the present invention further relates to a method for establishing a coordinate system on an image of the ocular fundus, comprising
assessing the presence of the optic nerve head area by a method as defined above, assessing the optic nerve head diameter,
assessing the presence of the fovea region,
arranging a coordinate system based on distance from the fovea, such as number of optic nerve head diameters from fovea.
Also, the present invention further relates to a method for grading lesions in a fundus image, comprising establishing a coordinate system by the method as described above, and grading the lesions with respect to distance to the fovea region.
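A sketch of such a coordinate system and of the distance measure in optic nerve head diameters is given below; the function names and the convention that points are given as (x, y) image coordinates are assumptions.

```python
import numpy as np

def fundus_coordinates(point, disc_center, fovea_center, disc_diameter):
    """Express an image point in a coordinate system with origin at the optic
    disc centre, abscissa along the disc-to-fovea axis, ordinate perpendicular
    to it, and units of optic disc diameters."""
    point = np.asarray(point, dtype=float)
    disc = np.asarray(disc_center, dtype=float)
    fovea = np.asarray(fovea_center, dtype=float)
    axis = (fovea - disc) / np.linalg.norm(fovea - disc)   # unit vector disc -> fovea
    normal = np.array([-axis[1], axis[0]])                 # perpendicular axis
    rel = point - disc
    return np.array([rel @ axis, rel @ normal]) / disc_diameter

def distance_to_fovea_in_disc_diameters(point, fovea_center, disc_diameter):
    """Distance used e.g. for grading lesions relative to the fovea."""
    rel = np.asarray(point, dtype=float) - np.asarray(fovea_center, dtype=float)
    return np.linalg.norm(rel) / disc_diameter
```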
When recording fundus images, at least 2 images are normally recorded from each fundus, representing different regions of the fundus. A golden standard for fundus images is recordation of 7 regions, all overlapping partly with at least one of the others. In order to examine the fundus region properly, registering or mounting of the images in a continuous manner with respect to the structures in the image is required, such as for example by arranging the images so that the vessels correctly continue from one image to the next. Also, the optic nerve head may be used for registering the images.
Accordingly, the invention also relates to a method for registering at least two different fundus images of the same fundus, comprising assessing the presence or absence of the optic nerve head in said images by a method as defined above, and orienting the images with respect to the optic nerve head.
Iterative detection of vessels
Various methods for detecting vessels in a fundus image have been described above. However, the identification of the optic nerve head area may also aid in detecting the vessels, since all vessels in the retina either start from the optic nerve head or end at the optic nerve head; that is, both the arteriolar network and the venolar network of vessels appear as trees having their roots at the optic nerve head, and may be tracked starting at the optic nerve head. However, detection of the optic nerve head and detection of the vessels may also be an iterative process, wherein the optic nerve head is detected giving rise to a detection of the vessels, and the detection of the vessels thereby gives rise to a reiteration of the detection of the optic nerve head and so forth, until a maximum of significance for both the optic nerve head and the vessels has been met.
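The iteration described above can be sketched as a simple loop; detect_onh and track_vessels stand for the two detectors and are assumed to be supplied by the caller, as is the convergence test on the movement of the ONH centre.

```python
import numpy as np

def iterate_onh_and_vessels(image, detect_onh, track_vessels,
                            max_iterations=5, tolerance=2.0):
    """Alternate ONH detection and vessel tracking until the ONH centre (given
    as an (x, y) pair) stops moving or a maximum number of iterations is hit."""
    vessels = None
    onh = detect_onh(image, vessels)          # initial estimate without vessels
    for _ in range(max_iterations):
        vessels = track_vessels(image, onh)   # vessels grown from / guided by the ONH
        new_onh = detect_onh(image, vessels)  # re-detect ONH using the vessel network
        if np.hypot(new_onh[0] - onh[0], new_onh[1] - onh[1]) < tolerance:
            return new_onh, vessels
        onh = new_onh
    return onh, vessels
```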
Accordingly, the present invention further relates to a method for detecting vessels in a fundus image, comprising
a) estimating the localisation of the optic nerve head region by a method as defined above,
b) estimating vessels based on the localisation of the optic nerve head area,
c) optionally repeating steps a) and b) at least once.

Detection of lesions
A very important aspect of the invention is the detection of the optional presence of the optic nerve head area in a fundus image before detection of any lesions of the fundus. Lesions of the retina normally comprise microaneurysms and exudates, which show up on fundus images as generally "dot shaped" (i.e. substantially circular) areas. It is of interest to distinguish between such microaneurysms and exudates, and further to distinguish them from other pathologies in the image, such as "cotton wool spots" and hemorrhages. If the optic nerve head area is present in the image it may give rise to errors when detecting lesions in the image. Therefore, the present invention further relates to a method for assessing the presence or absence of lesions in a fundus image, comprising
a) estimating the localisation of the optic nerve head area in the image if present, masking a region comprising the optic nerve head area, and
b) estimating the presence or absence of lesions in the remaining image.
Normally a region around the detected optic nerve head area is masked, such as a region having a dimension corresponding to at least 1.1 times the diameter of the estimated optic nerve head area, such as at least 1.3 times the diameter of the estimated optic nerve head area, such as at least 1.5 times the diameter of the estimated optic nerve head area is masked, such as a region corresponding to at least 1.7 times the diameter of the estimated optic nerve head area is masked, such as a region corresponding to at least 2.0 times the diameter of the estimated optic nerve head area is masked.
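Masking the region around the detected optic nerve head before lesion detection could be done as in the sketch below; the enlargement factor of 1.5 is only one of the values mentioned above, used here as a default.

```python
import numpy as np

def mask_onh_region(image_shape, onh_center, onh_radius, enlarge=1.5):
    """Boolean mask that is False inside a circle of radius enlarge*onh_radius
    around the ONH centre (given as (row, column)) and True elsewhere; lesion
    detection is then run only where the mask is True."""
    cy, cx = onh_center
    yy, xx = np.mgrid[0:image_shape[0], 0:image_shape[1]]
    outside = (yy - cy) ** 2 + (xx - cx) ** 2 > (enlarge * onh_radius) ** 2
    return outside
```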
The lesions may be detected by any suitable method known to the person skilled in the art.
A preferred method is described in a co-pending PCT application entitled "Lesion detection in fundus images" by RETINALYZE A/S.

Glaucoma
Glaucoma causes the exfoliation of the nerve fiber layer, entailing the expansion of the recess of the optic disc on which the nerve fiber layer converges. It is known that, in most cases, a portion of the optic disc blood vessel extending in the vicinity of the optic disc edge is bent severely as the recess of the optic disc expands.
Thus, indicators for glaucoma may be detected by studying the nerve fiber layer and/or the vessels in the vicinity of the optic nerve head edge. Preferably both methods are applied.
The reflectance of the nerve fiber layer decreases with progressive exfoliation due to glaucoma or the like, and the tone of the image of the nerve fiber layer darkens. Although specialists in glaucoma analysis are capable of identifying very delicate glaucomatous changes in the nerve fiber layer, such changes are generally difficult to detect. As exfoliation of the nerve fiber layer propagates along the fiber bundles, such exfoliation is called optic nerve fiber bundle defect.
Several methods for detecting optic nerve fiber bundles are known to the person skilled in the art, for example searching the retinal image along radial lines radiating from the center of the optic disc. Another method directly searches the arrangement of the optic nerve fibers and detects defects in the nerve fiber bundles with a high sensitivity and without being disturbed by noise. Furthermore, this embodiment is capable of introducing, into image processing, medical information including the probability of occurrence of glaucomatous scotomas in the visual field. The information about the nerve fiber layer can be contrasted with information about the optic disc by projecting the information about the nerve fiber layer on the information about the optic disc inner angle θ. Accordingly, a comprehensive glaucoma analyzing system can be built for example as described in US Patent 5,868,134, Sugiyama, et al.
The image may be converted into a blue and green image having sixty-four gradations. The blue and green image corresponds to a common red-free projection photograph. Defects in the nerve fiber bundles are set off by contrast, and the influence of the choroid blood vessels is insignificant in the blue and green image. Details of the image that would adversely affect the analysis are excluded. In this embodiment, the optic disc portion, the blood vessel portion and the periphery of the blood vessels may be extracted and excluded, so that the retina portion is used for the following analysis.
The analysis of blood vessel curvature on the optic disc edge can be carried out by the embodiment in US Patent 5,868,134, Sugiyama, et al., wherein a blood vessel curvature VC(θ) with respect to the direction of the optic disc inner angle (θ), as described above for the co-ordinate system, is detected for each of the optic disc edge blood vessels. The blood vessel curvature VC(θ) is a curvature with respect to a direction of the midpoint C. The blood vessel curvatures VC(θ) determined are stored for the comprehensive analysis of glaucoma. In another embodiment the blood vessel curvature may be assessed by measuring the ratio of the length of the straight line between two arbitrary points on a blood vessel intersecting the optic nerve head edge to the length between the same points measured along the course of the vessel. If the ratio is close to 1, the vessel is straight; the closer the ratio approaches 0, the more bent the vessel.
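A minimal sketch of this chord-to-arc length ratio is given below; the point structure and function name are illustrative and the vessel centre line is assumed to be available from the vessel tracking.

#include <cmath>
#include <vector>

struct Point { double r, c; };   // (row, column) coordinates on a vessel centre line

// Ratio of the straight-line distance between the first and last point of a vessel
// segment to the length measured along the course of the vessel. A ratio close to 1
// means a straight vessel; the closer the ratio gets to 0, the more bent the vessel.
double vesselStraightnessRatio(const std::vector<Point>& centreLine)
{
    if (centreLine.size() < 2)
        return 1.0;                       // a single point is trivially "straight"

    auto dist = [](const Point& a, const Point& b) {
        return std::hypot(a.r - b.r, a.c - b.c);
    };

    double arcLength = 0.0;
    for (size_t i = 1; i < centreLine.size(); ++i)
        arcLength += dist(centreLine[i - 1], centreLine[i]);

    const double chordLength = dist(centreLine.front(), centreLine.back());
    return arcLength > 0.0 ? chordLength / arcLength : 1.0;
}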
In yet another embodiment the indicators are determined by assessing the inner and outer edge of the optic nerve head.
Thus, in an aspect the present invention relates to a method for detecting indicators of glaucoma in a fundus image, comprising
a) estimating the localisation of the optic nerve head area in the image if present, and
b) detecting indicators of glaucoma related to the optic nerve head area.
Even in a case where an ophthalmologist is unable to recognize the condition of glaucoma by observation of the visual field and the fundus, the symptom of glaucoma represented by defects in the nerve fiber bundles can be recognized by processing the retinal image. Thus, the present invention is an effective assistance to the ophthalmologist in diagnosing glaucoma. The perimeter may be a generally known automatic perimeter, a flicker perimeter for glaucoma detection, a blue pyramid perimeter or a perimeter using contrast sensitivity measurement.
System
In another aspect the invention further relates to a system for assessing the presence or absence of the optic nerve head in a fundus image. Thus, the system according to the invention may be any system capable of conducting the method as described above, as well as any combinations thereof within the scope of the invention. Accordingly, the system comprises
a) an algorithm for establishing at least one starting point representative for an optic nerve head,
b) an algorithm for initiating a search for the best candidate optic nerve head area for each starting point, and for assigning a power to the candidate optic nerve head area,
c) an algorithm for selecting n best candidate optic nerve head area(s) with respect to the power assigned in b), wherein n is an integer > 1 ,
d) an algorithm for ranking the n selected candidate optic nerve head area(s) with respect to at least one validating criteria,
e) an algorithm for selecting the highest ranking candidate optic nerve head area fulfilling the validating criteria,
f) an algorithm for classifying the candidate optic nerve head area selected in e) with respect to a threshold as the optic nerve head area or not.
Any of the algorithms may be adapted to the various variations of the methods described above. Accordingly, the system may include algorithms to perform any of the methods described above. A graphical user interface module may operate in conjunction with a display screen of a display monitor. The graphical user interface may be implemented as part of the processing system to receive input data and commands from a conventional keyboard and mouse through an interface and display results on a display monitor. For simplicity of the explanation, many components of a conventional computer system have not been discussed such as address buffers, memory buffers, and other standard control circuits because these elements are well known in the art and a detailed description thereof is not necessary for understanding the present invention.
Pre-acquired image data can be fed directly into the processing system through a network interface and stored locally on a mass storage device and/or in a memory. Furthermore, image data may also be supplied over a network, through a portable mass storage medium such as a removable hard disk, optical disks, tape drives, or any other type of data transfer and/or storage devices which are known in the art.
One skilled in the art will recognize that a parallel computer platform having multiple processors is also a suitable hardware platform for use with a system according to the present invention. Such a configuration may include, but not be limited to, parallel machines and workstations with multiple processors. The processing system can be a single computer, or several computers can be connected through a communications network to create a logical processing system.
The present system allows the grader, that is, the person normally grading the images, to identify the optic nerve head area more rapidly and securely, if it is present in the image. Also, the present system allows an automatic detection of lesions and other pathologies of the retina without interference from the optic nerve head area, again as an aiding tool for the traditional grader.
By use of the present system it is also possible to arrange for recordation of the images at one location and examination of them at another location. For example, the images may be recorded by any optician or physician or elsewhere and be transported to the examining specialist, either as photos or the like or on digital media. Accordingly, by use of the present system the need for decentralised centres for recording the images can be met while maintaining fewer expert graders. Furthermore, in addition to the communication of images and medical information between persons involved in the procedure, the network may carry data signals including control or image adjustment signals by which the expert examining the images at the examining unit directly controls the image acquisition occurring at the recordation location, i.e. the acquisition unit. In particular, such command signals as zoom magnification, steering adjustments, and wavelength of field illumination may be selectively varied remotely to achieve the desired imaging effect. Thus, questionable tissue structures requiring greater magnification or a different perspective for their elucidation may be quickly resolved without ambiguity by varying such control parameters. Furthermore, by switching illumination wavelengths, views may be selectively taken to represent different layers of tissue, or to accentuate imaging of the vasculature and blood flow characteristics. In addition, where a specialized study such as fluorescence imaging is undertaken, the control signals may include time-varying signals to initiate stimulation with certain wavelengths of light, to initiate imaging at certain times after stimulation or delivery of dye or drugs, or other such precisely controlled imaging protocols. The digital data signals for these operations may be interfaced to the ophthalmic equipment in a relatively straightforward fashion, provided such equipment already has initiating switches or internal digital circuitry for controlling the particular parameters involved, or is capable of readily adapting electric controls to such control parameters as system focus, illumination and the like.
Also, the examining expert could be able to exert some treatment in the same remote manner. It will be understood that the imaging and ophthalmic treatment instrumentation in this case will generally include a steering and stabilization system which maintains both instruments in alignment and stabilized on the structures appearing in the field of view. However, in view of the small but non-negligible time delays still involved between image acquisition and initiation of diagnostic or treatment activity at the examination site, the invention contemplates that the system control further includes image identification and correlation software which allows the ophthalmologist at the examination site to identify particular positions in the retinal field of view, such as pinpointing particular vessels or tissue structures, and that the image acquisition computer includes image recognition software which enables it to identify patterns in the video frames and correlate the identified position with each image frame as it is acquired at the acquisition site. For example, the image recognition software may lock onto a pattern of retinal vessels. Thus, despite the presence of saccades and other abrupt eye movements of the small retinal field which may occur over relatively brief time intervals, the ophthalmic instrumentation is aimed at the identified site in the field of view and remote treatment is achieved.
In addition to the foregoing operation, the invention further contemplates that the images provided by acquisition unit are processed for photogrammetric analysis of tissue features and optionally blood flow characteristics. This may be accomplished as follows. An image acquired at the recordation unit is sent to an examination unit, where it is displayed on the screen. As indicated schematically in the figure, such image may include a network of blood vessels having various diameters and lengths. These vessels include both arterial and venous capillaries constituting the blood supply and return network. At the examination unit, the workstation is equipped with a photogrammetric measurement program which for example may enable the technician to place a cursor on an imaged vessel, and moving the cursor along the vessel while clicking, have the software automatically determine the width of the vessel and the subvessels to which it is connected, as well as the coordinates thereof.
The software for noting coordinates from the pixel positions and linking displayed features in a record, as well as submodules which determine vessel capacities and the like, are straightforward and readily built up from photogrammetric program techniques. Workstation protocols may also be implemented to automatically map the vasculature as described above, or to compare two images taken at historically different times and identify or annotate the changes which have occurred, highlighting for the operator features such as vessel erosion, tissue which has changed colour, or other differences. In addition, a graphical user interface allows the specialist to type in diagnostic indications linked to the image, or to a particular feature appearing at a location in the image, so that the image or a processed version of it becomes more useful.
With suitable training, the relative health of the vessel, its blood carrying capacity and the like may also be visually observed and noted. This photogrammetric analysis allows a road map of the vasculature and its capacity to be compiled, together with annotations as to the extent of tissue health or disease apparent upon such inspection. Thus, a very precise and well-annotated medical record may be readily compiled and may be compared to a previously taken view for detailed evidence of changes over a period of time, or may be compared, for example, to immediately preceding angiographic views in order to assess the actual degree of blood flow occurring therein. As with the ophthalmologist's note pad entries at the examination unit, the measurement entries at the examination unit become an annotated image record and are stored in the central library as part of the patient's record.
Unlike a simple medical record system, the present invention changes the dynamics of patient access to care, and the efficiency of delivery of ophthalmic expertise in a manner that solves an enormous current health care dilemma, namely, the obstacle to proper universal screening for diabetic retinopathy. A basic embodiment of the invention being thus disclosed and described, further variations and modifications will occur to those skilled in the art, and all such variations and modifications are encompassed within the scope of the invention as defined in the claims appended hereto.
Example
In the following one embodiment of the invention is described in relation to the drawings.
The ONH detection algorithm
Introduction
Possible initial locations or starting points (seed points) of the optical nerve head (ONH) are found by determining the local maximums in feature images. The feature images are produced by considering the flux in the vessel tree, vertical filtering of the vessel tree, and the intensity and variance of the un-sharpened fundus image.
It is generally a good assumption that the ONH is of a circular shape. Hence, a three-dimensional exhaustive search is performed around the seed points in order to determine the optimal position and diameter of a circle fitting the (possible) ONH. Utilizing rules regarding the structure of the vessel tree and the intensity and variance maximums of the un-sharpened image allows accepting or rejecting a possible ONH. If accepted, the origin and diameter of the ONH are already known from the exhaustive search.
The following sections present the algorithm in detail. The fundus image in Figure 1 serves to illustrate the individual steps in the ONH detection throughout the current research report.
Input/output from ONH detection library
The C/C++ header that defines the input/output from the ONH detection library is shown below.
/* 'ONHdetection' finds the Optic Nerve Head (ONH) in a fundus image.

Input:

*image: Pointer to an image containing the fundus image, including the ROI detection in the mask image (for full images the mask might be NULL). The image might be an RGB image or a multi-frame image (in which case it is assumed that the first frame (frame 0) contains the fundus image).

*vessel: Pointer to a float image (PFFLOAT) containing the detected vessels in the fundus image. Pixel values must be positive or zero. A zero pixel value indicates no vessel, else the value is the width of the vessel. If the probability of a vessel is high, a value of 100 is added to the width. In other words: uncertain vessels have values ]0;100[, and certain vessels have values > 100 (the actual width is given as 'pixel value minus 100').

expectedDiskDiameter: Double value of the expected disk diameter (EDD) (notice it is the diameter, NOT the radius). The search for a fitting ONH will be conducted in the range [innerCircleFactor; outterCircleFactor]*EDD, which equals [0.6; 1.2]*EDD (the two values are hard-coded!). In other words we search between -40% and +20% of the EDD value. Note: expectedDiskDiameter must be >= 10!

Output:

return value: If 'ONHdetection' returns 'true' no error was encountered during the process, else 'false' is returned.

foundONH: If NO ONH is detected, 'foundONH' is 'false', else 'true'.

r, c, radius: 'ONHdetection' returns the ONH area as a circle with origin in (rows, columns) = (r, c) and radius 'radius'. */

#pragma once

#include <Imaging_Utilities.h>

extern bool __fastcall ONHdetection(CImage *image, CImage *vessel, bool &foundONH,
    double &r, double &c, double &radius,
    const double expectedDiskDiameter = 200,
    int (__cdecl *Progress)(double Pct) = NULL);
Expected disk diameter
Different resolutions of the digital equipment used to convert from analog to digital images, different degrees of fundus coverage, and the magnification of the fundus camera all change the resolution of the image (in μm/pixel). Therefore an Expected Disk Diameter (EDD) must be provided, both in order to limit the search for the ONH and to reduce the probability of a false ONH detection.
Typically, the EDD is found as the mean ONH diameter measured in a number of images. The measure is done in pixels.
Assuming a constant relationship between the height of the image and the size of the ONH given the angle, the system has predefined values for 30 and 45 degrees of fundus coverage.
The following statements are due to a review article by Jonas (Jonas, J.B., "Ophthalmoscopic Evaluation of the Optic Disc and Retinal Nerve Fiber Layer", American Academy of Ophthalmology (AAO), 2000). The optic disc area is independent of age beyond an age of about 3 to 10 years. The optic disc measurements vary according to the method applied. The mean optic disc area of non-highly myopic Caucasians examined in various studies ranged between 2.1 mm² and about 2.8 mm².
Jonas does not state the direct measurements of the horizontal and vertical disc diameters. Instead, the calculation of the optic disc area follows a modified formula of an ellipse (Area = π/4 * horizontal diameter * vertical diameter). The optic disc has a slightly vertically oval form with the vertical diameter being about 7% to 10% larger than the horizontal one.
Using the modified formula of an ellipse and a constant relationship between the horizontal and vertical diameter (the mean between 7% and 10%) gives:
Area = π/4 * horizontal diameter * (horizontal diameter * (1 + (7% + 10%)/2))
     = (π * 1.085)/4 * (horizontal diameter)²
Using the previous equation, the area interval of 2.1 - 2.8 mm² gives a horizontal diameter of 1570 - 1813 μm.
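For illustration, the short sketch below inverts the modified ellipse formula to recover the horizontal diameter from a given disc area; the function name and the local constant are illustrative.

#include <cmath>

// Invert Area = (pi * 1.085) / 4 * d^2 to get the horizontal disc diameter d (in mm)
// from the disc area (in mm^2). For 2.1 mm^2 this gives about 1.57 mm and for
// 2.8 mm^2 about 1.81 mm, i.e. roughly 1570 - 1813 micrometres as stated above.
double horizontalDiameterFromArea(double areaMm2)
{
    const double kPi = 3.14159265358979323846;
    return std::sqrt(4.0 * areaMm2 / (kPi * 1.085));
}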
How the different feature images are generated is shown below. 400 images from the Welsh Community Diabetic Retinopathy Study (WCDRS) are used to generate seed points. Each seed point initiates a search for a local optimal ONH. Ideally, only one seed point should be generated, but fundus images are far from ideal. The anatomical variation makes it necessary to search at different locations, especially when the quality of the image is not perfect. Multiple seed points (or, more exactly, multiple ONH candidates) are also used to determine the presence or absence of the ONH in the Field of View (FOV).
Variable definitions and values
Table 1. Controlling variables used when generating seed points (the table is reproduced as an image in the original publication).

Pre-filtering
Four steps make up the pre-filtering:

1) Apply a median filter.

2) If the image is an RGB image, the gray scale intensity is calculated as the mean of the red and green channels; else it is assumed to be a gray scale image.

3) Resize the image.

4) Apply an un-sharp filter (figure 2).
The first step is applied to suppress noise. The size of the median filter equals 'medianKernelSize' (Table 1).
The ONH is normally more pronounced in the red channel when the image quality is poor. Therefore the red channel is also used when converting to a gray scale image in the second step.
The image is reduced in size mainly because the ONH is a large feature and experiments show no significant performance decrease. Currently the image is reduced so that the EDD becomes 'RescaledEDD' pixels (Table 1). This means that the WCDRS images are reduced by a factor of four (the original EDD equals 200 pixels), which equals 'rescaleFactor' (Table 1). In the WCDRS, the rescaled numbers of rows and columns become 'rr' and 'cc' (Table 1), respectively.
Lastly, uneven distribution of light in the image makes it necessary to remove the low frequencies. Many techniques exist, but un-sharp filtering has been chosen since it is fast and works well. The size of the median kernel is 'unsharpKernelSize' (Table 1).
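For illustration, a minimal sketch of the four pre-filtering steps is given below, using OpenCV as an assumed image-processing library. The default kernel sizes and rescale factor stand in for 'medianKernelSize', 'rescaleFactor' and 'unsharpKernelSize' from Table 1 (reproduced only as an image) and are illustrative; a mean filter is used here for the un-sharp step, whereas a median filter may equally be used as stated elsewhere in this document.

#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <vector>

// Sketch of the pre-filtering: 1) median filter, 2) gray scale as the mean of the red
// and green channels, 3) resize, 4) un-sharp filter (subtract a low-pass version).
// The input is assumed to be an 8-bit fundus image (BGR or single channel).
cv::Mat prefilterFundusImage(const cv::Mat& input,
                             int medianKernelSize = 5,
                             double rescaleFactor = 0.25,
                             int unsharpKernelSize = 51)
{
    // 1) Median filter to suppress noise.
    cv::Mat denoised;
    cv::medianBlur(input, denoised, medianKernelSize);

    // 2) Gray scale intensity as the mean of the red and green channels.
    cv::Mat gray;
    if (denoised.channels() == 3) {
        std::vector<cv::Mat> bgr;
        cv::split(denoised, bgr);
        cv::addWeighted(bgr[1], 0.5, bgr[2], 0.5, 0.0, gray);   // (green + red) / 2
    } else {
        gray = denoised;
    }

    // 3) Resize so that the expected disk diameter becomes the rescaled EDD.
    cv::Mat resized;
    cv::resize(gray, resized, cv::Size(), rescaleFactor, rescaleFactor, cv::INTER_AREA);

    // 4) Un-sharp filter: subtract a smoothed version to remove low frequencies.
    cv::Mat lowPass, unsharp;
    resized.convertTo(resized, CV_32F);
    cv::blur(resized, lowPass, cv::Size(unsharpKernelSize, unsharpKernelSize));
    cv::subtract(resized, lowPass, unsharp);
    return unsharp;
}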
Vessel flux
Let

minFluxWidth = VesselThickness('meanWidthFractile'),

where 'meanWidthFractile' is found in Table 1.
The endpoints of the thickest vessels (> minFluxWidth) initialize a search where the vessels are followed until the thickness drops below 'minFluxWidth' or the direction of the vessel changes too abruptly. A counter at each node point in the vessel tree is increased every time the node is visited. The flux seed points are defined as all the node points having a counter larger than or equal to the value of the largest count minus one (Figure 3).
Above we stated that the direction should not change too abruptly. More precisely, we follow the vessel having the largest angle with the incoming vessel (all vectors are pointing away from the node point) and a vessel width > minFluxWidth (Table 1 ). Cosine of the largest angle must be larger than 'cosAngleMin' (Table 1), else the search is stopped and the next endpoint is searched.
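A minimal sketch of this flux-counting idea is given below. It assumes the vessel tree is available as a simple node/adjacency structure; 'minFluxWidth' and the angle threshold mirror the Table 1 variables, whose values are not reproduced here. Note that the sketch measures the angle between the incoming and outgoing directions of travel, whereas the text above measures the angle with both vectors pointing away from the node point (the two conventions differ only in the sign of the cosine), so the threshold convention may need to be adapted.

#include <algorithm>
#include <cmath>
#include <vector>

struct VesselNode {
    double r = 0.0, c = 0.0;        // node position (row, column)
    std::vector<int> neighbours;    // indices of the connected node points
    std::vector<double> widths;     // width of the vessel segment towards each neighbour
    int visitCount = 0;             // how often a traversal passes through this node
};

// Cosine of the angle between the direction of travel (prev -> cur) and a candidate
// continuation (cur -> cand); 1 means straight ahead, -1 a full U-turn.
static double travelCosine(const VesselNode& prev, const VesselNode& cur, const VesselNode& cand)
{
    const double inR = cur.r - prev.r, inC = cur.c - prev.c;
    const double outR = cand.r - cur.r, outC = cand.c - cur.c;
    const double den = std::hypot(inR, inC) * std::hypot(outR, outC);
    return den > 0.0 ? (inR * outR + inC * outC) / den : 1.0;
}

// Follow the vessel tree from a thick endpoint, incrementing the visit counter of every
// node passed, until the segments become too thin or the direction changes too abruptly.
void traceFromEndpoint(std::vector<VesselNode>& tree, int endpoint,
                       double minFluxWidth, double minTravelCosine)
{
    int prev = -1, cur = endpoint;
    while (cur >= 0) {
        tree[cur].visitCount++;
        int next = -1;
        double bestCos = -2.0;
        for (size_t i = 0; i < tree[cur].neighbours.size(); ++i) {
            const int cand = tree[cur].neighbours[i];
            if (cand == prev || tree[cur].widths[i] < minFluxWidth)
                continue;                           // too thin, or going back where we came from
            const double cosv = (prev < 0) ? 1.0    // no incoming direction at the endpoint
                                           : travelCosine(tree[prev], tree[cur], tree[cand]);
            if (cosv > bestCos) { bestCos = cosv; next = cand; }
        }
        if (next < 0 || bestCos < minTravelCosine)  // no thick continuation, or turn too sharp
            break;
        prev = cur;
        cur = next;
    }
}

// The flux seed points are the nodes whose visit counter is at least the largest
// counter value minus one.
std::vector<int> fluxSeedPoints(const std::vector<VesselNode>& tree)
{
    int maxCount = 0;
    for (const VesselNode& n : tree)
        maxCount = std::max(maxCount, n.visitCount);
    std::vector<int> seeds;
    for (size_t i = 0; i < tree.size(); ++i)
        if (tree[i].visitCount >= maxCount - 1)
            seeds.push_back(static_cast<int>(i));
    return seeds;
}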
Vertical filtering of vessel image
The large vessels near the ONH are characterized by being almost vertical. Therefore two feature images emphasizing the vertical vessels are used to guide the search for the ONH, figure 5.
The vertical vessels are emphasized using an anisotropic Gaussian filter having a kernel size of (row, column) = ('rowStdKernelSize', 'colStdKernelSize') (Table 1).
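For illustration, a minimal sketch of this filtering step using OpenCV is shown below; the kernel sizes stand in for 'rowStdKernelSize' and 'colStdKernelSize' and the default values are illustrative.

#include <opencv2/imgproc.hpp>

// Anisotropic Gaussian filter whose kernel is taller than it is wide, so that
// vertically oriented vessels are reinforced. When both sigmas are zero, OpenCV
// derives them from the kernel width and height, giving the desired anisotropy.
cv::Mat emphasiseVerticalVessels(const cv::Mat& vesselImage,
                                 int rowKernelSize = 31, int colKernelSize = 7)
{
    cv::Mat out;
    cv::GaussianBlur(vesselImage, out,
                     cv::Size(colKernelSize, rowKernelSize),  // (width, height)
                     /*sigmaX=*/0, /*sigmaY=*/0);
    return out;
}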
The two images that are filtered are described in the two following sub-sections. The two images do not generate identical seed points. Especially when the vessel tracking has detected large, wide (false) vessels (such as choroidal vessels or dark vessel-like structures in the fundus image), the MAT helps minimise this problem, as the width has no influence after the MAT has been applied.

On the other hand, the large "density of vertical vessels" near the ONH is neglected when using the MAT image. Therefore the two feature images often complement each other.

Boolean vessel image
The result from the vessel-tracking algorithm is part of the input to the ONH algorithm. The input is converted to a Boolean image where the certain vessels form the foreground.
Medial Axis Transform (MAT) of the vessel image
The medial axis transform algorithm finds the mid-line of a structure. The midpoints are defined by the center of the largest circle touching more than one point of the border (of the object). The MAT is calculated from the Boolean vessel image, figure 4.
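The sketch below approximates this step with morphological thinning from the OpenCV ximgproc (contrib) module; this is not the exact medial axis transform described above, but it likewise produces a one-pixel-wide centre line on which the vessel width has no influence.

#include <opencv2/imgproc.hpp>
#include <opencv2/ximgproc.hpp>   // thinning() lives in the opencv_contrib ximgproc module

// Approximate the medial axis of the Boolean vessel image by skeletonisation.
cv::Mat vesselCentreLine(const cv::Mat& booleanVesselImage)
{
    cv::Mat binary, skeleton;
    // thinning() expects an 8-bit single-channel image with foreground > 0.
    booleanVesselImage.convertTo(binary, CV_8U);
    cv::threshold(binary, binary, 0, 255, cv::THRESH_BINARY);
    cv::ximgproc::thinning(binary, skeleton, cv::ximgproc::THINNING_ZHANGSUEN);
    return skeleton;
}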
Gaussian filtering of the pre-filtered image
The ONH is often the brightest area in the image. Gaussian filtering of the pre-filtered image is used in order to exploit this observation. The kernel size is given by 'blurKernelSize' (Table 1), figure 6.
Variance filtering of the pre-filtered image
The variance feature is undoubtedly one of the best features if the ONH is present in the FOV. But like any other single feature, this feature may also fail, especially when the contrast of the image is poor.
Simply applying a variance filter (or, as in this case, a standard deviation filter) gives a "blocky" feature image. Using a weighted filter solves this. Equivalently, a low pass filter, such as the Gaussian filter, can be applied afterwards. This could also have been achieved by using a weighted variance filter defined as the convolution of the original variance filter and the Gaussian filter.
The kernel size of the two filters, the standard deviation filter and the Gaussian filter, is 'stdKernelSize' (Table 1), and the result is shown in figure 7.
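For illustration, a minimal sketch of this feature using OpenCV is given below; 'kernelSize' stands in for 'stdKernelSize' and its default value is illustrative.

#include <opencv2/imgproc.hpp>

// Local standard deviation filter followed by a Gaussian low-pass filter, which
// avoids the "blocky" appearance of a plain variance/std filter.
cv::Mat weightedStdDevImage(const cv::Mat& prefiltered, int kernelSize = 15)
{
    cv::Mat img, mean, meanOfSquares, variance, stddev, smoothed;
    prefiltered.convertTo(img, CV_32F);

    const cv::Size k(kernelSize, kernelSize);
    cv::blur(img, mean, k);                     // local mean
    cv::blur(img.mul(img), meanOfSquares, k);   // local mean of squares

    // var = E[x^2] - (E[x])^2, clamped at zero against rounding errors.
    variance = meanOfSquares - mean.mul(mean);
    cv::threshold(variance, variance, 0, 0, cv::THRESH_TOZERO);
    cv::sqrt(variance, stddev);

    // Gaussian smoothing applied afterwards, equivalent to a weighted variance filter.
    cv::GaussianBlur(stddev, smoothed, k, 0);
    return smoothed;
}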
Seed points selection

Above it was described how the feature images are generated. The present section describes the extraction of seed points from the feature images; the seed points are shown in figure 9.
Local maximums are found on each feature image using the watershed algorithm described in the co-pending patent application entitled "Lesion detection". Seed points are selected as the local maximums having a value greater than or equal to:
val > max - (max - min)/2,
where 'max' and 'min' are the global maximum and global minimum, respectively. 'val' is the value of the local maximum being tested.
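A minimal sketch of this selection rule is given below; the seed point structure and function name are illustrative, and the comparison is written as "greater than or equal to" as stated above.

#include <algorithm>
#include <iterator>
#include <vector>

struct SeedPoint { int r = 0, c = 0; double value = 0.0; };

// Keep only the local maxima whose value lies in the upper half of the [min, max]
// range of the feature image.
std::vector<SeedPoint> selectSeedPoints(const std::vector<SeedPoint>& localMaxima,
                                        double globalMin, double globalMax)
{
    const double threshold = globalMax - (globalMax - globalMin) / 2.0;
    std::vector<SeedPoint> selected;
    std::copy_if(localMaxima.begin(), localMaxima.end(), std::back_inserter(selected),
                 [threshold](const SeedPoint& s) { return s.value >= threshold; });
    return selected;
}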
ONH circle optimization
The following sub-sections describe how a three-dimensional (3D) exhaustive search is used to determine the exact placement of the ONH candidates.
Variable definitions and values
Table 2. Controlling variables used when searching for the optimal ONH (the table is reproduced as images in the original publication).
Avoidance mask
Some areas (pixels) should not be included when searching for suitable ONH candidates. The following section uses the gradient in order to find the optimal placement of the ONH candidate. Obviously, high gradients between the brighter ROI and the darker surrounding area should be avoided, and pixels near the ROI border are therefore excluded from the calculation.
More surprisingly, it is also necessary to ensure that the high gradients between the background of the fundus and the vessels are avoided. If not, the optimization is likely to fail, either by missing the real ONH or because other candidates get too much power from the erroneous gradients and thereby "shadow" the correct ONH. (The ROI and the dilated vessels together give the avoidance mask. If the "endpoint" of the gradient is located in the dark area, the gradient is given the value of zero, see figure 8.)
Optimization of local optimal ONH candidates
In order to keep the degrees of freedom at a minimum, the ONH is assumed to have a circular shape leaving the center and the radius as the only unknowns, in total, three degrees of freedom. The gradient orthogonal to a circle is used as cost function to find the border of the ONH (Figure 14). Figure 14b is chosen in the current implementation. The number of points on the circle is 'nCircleResolution' (Table 2).
For each seed point a 3D exhaustive search is performed in order to find the maximal gradient. The size of the search area around the seed point is '2*boxHalfSize+1' with step size 'boxStep' (Table 2). The last search dimension is the radius of the circle. Here, we search the interval [innerCircle; outterCircle] with step size 'boxStep' (Table 2). The radius of the outer circle is 'gradientCircleEnlargement' (Table 2) times the radius of the inner circle (Figure 14).
Significant speed-up is achieved by reusing intermediate computations. One float-image with two frames is allocated. When an optimal radius is found for a specific coordinate (pixel), the radius (actually a positive integer) and the circle power are stored in the float-image. If a search region overlaps an already calculated coordinate (pixel), the optimal values (radius and circle power) are taken from the float-image. Since only positive radii are searched, it is quite simple to test whether or not an optimization has already been performed for a specific coordinate.
Should the value of 'gradientCircleEnlargement' be large or small? Let the sum of intensities at the 'nCircleResolution' circle points be called the circle power. As shown in Figure 14, the sum of the individual gradients is equal to the difference between the circle powers of the outer and the inner circle. This means that the gradient is calculated on a one-dimensional function as illustrated in Figure 15. 'gradientCircleEnlargement' should be chosen as small as possible in order to calculate the local gradient; on the other hand, the smaller it is, the more sensitive the measure becomes to noise and digitalization. The value shown in Table 2 seems to be a very good choice.
Using the avoidance mask partially avoids the (very) high false gradients. This means that the gradient is "excluded" if the end point is placed outside the mask. Notice that "excluded" means that the gradient value is set to zero. This is necessary in order to avoid false high gradients and "wrong" circle powers. If the mean were used instead, the optimization would have a tendency to favor placement outside the mask.
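A minimal sketch of the exhaustive search is given below. It assumes the feature image and the avoidance mask are row-major buffers; the memoisation with the two-frame float image described above is omitted for brevity, the default parameter values are illustrative, and the sign of the gradient cost may have to be flipped depending on which of the orientations in Figure 14 is used.

#include <cmath>
#include <vector>

struct OnhCircle { int r = 0, c = 0, radius = 0; double power = -1.0; };

// Sum of intensities at nPoints equidistant points on a circle ("circle power").
// Points outside the image or inside the avoidance mask contribute zero, so that
// false gradients at the ROI border and on the vessels are ignored.
static double circlePower(const std::vector<float>& img, const std::vector<unsigned char>& avoidMask,
                          int rows, int cols, double cr, double cc, double radius, int nPoints)
{
    const double kPi = 3.14159265358979323846;
    double power = 0.0;
    for (int i = 0; i < nPoints; ++i) {
        const double a = 2.0 * kPi * i / nPoints;
        const int r = static_cast<int>(std::lround(cr + radius * std::sin(a)));
        const int c = static_cast<int>(std::lround(cc + radius * std::cos(a)));
        if (r < 0 || r >= rows || c < 0 || c >= cols)
            continue;
        const size_t idx = static_cast<size_t>(r) * cols + c;
        if (avoidMask[idx])
            continue;
        power += img[idx];
    }
    return power;
}

// 3D exhaustive search (row, column, radius) around one seed point.
OnhCircle searchAroundSeed(const std::vector<float>& img, const std::vector<unsigned char>& avoidMask,
                           int rows, int cols, int seedR, int seedC,
                           int innerRadius, int outerRadius,
                           int boxHalfSize = 10, int boxStep = 1,
                           int nCircleResolution = 64, double gradientCircleEnlargement = 1.1)
{
    OnhCircle best;
    for (int r = seedR - boxHalfSize; r <= seedR + boxHalfSize; r += boxStep)
        for (int c = seedC - boxHalfSize; c <= seedC + boxHalfSize; c += boxStep)
            for (int rad = innerRadius; rad <= outerRadius; rad += boxStep) {
                // Cost: circle power on the enlarged outer circle minus the power on the
                // inner circle, i.e. the summed gradient orthogonal to the circle.
                const double inner = circlePower(img, avoidMask, rows, cols, r, c, rad, nCircleResolution);
                const double outer = circlePower(img, avoidMask, rows, cols, r, c,
                                                 rad * gradientCircleEnlargement, nCircleResolution);
                const double p = outer - inner;
                if (p > best.power) {
                    best.r = r; best.c = c; best.radius = rad; best.power = p;
                }
            }
    return best;
}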
Seed points may be placed so that their search areas overlap, which may result in the same optimal position (and therefore also the same optimal radius). These identical results are removed, thereby producing an array of unique local optimal ONH candidates. A descending sort with respect to circle power is performed on the array. (Figure 11 shows the optimized ONH candidates. The optimization was initialized from the seed points seen. The intensities of the circles are equivalent to the circle power assigned to each ONH candidate.)

Choose the right ONH among candidates
In order to choose the real ONH among the candidates, or not to choose any ONH at all, different rules are utilized.
Variable definitions and values
Table 3. Controlling variables used when searching for the optimal ONH (the table is reproduced as an image in the original publication).

Tangential vessel exclusion
When looking at fundus images, one observes that no vessels near the ONH are likely to be tangential to the ONH, or more technically speaking: the orthogonal projection of the ONH center point onto a linear vessel segment sends the projected point outside this line segment (Figure 17).
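A minimal sketch of this projection test is given below; the point structure and function name are illustrative.

#include <cmath>

struct Pt { double r, c; };

// Project the candidate ONH centre orthogonally onto the line through a straight
// vessel segment and check whether the projected point falls inside the segment.
// If it does, the segment is tangential to the candidate (for a true ONH the
// projected point is expected to fall outside the segment).
bool projectionFallsInsideSegment(const Pt& onhCentre, const Pt& segStart, const Pt& segEnd)
{
    const double dr = segEnd.r - segStart.r;
    const double dc = segEnd.c - segStart.c;
    const double len2 = dr * dr + dc * dc;
    if (len2 <= 0.0)
        return false;                 // degenerate segment

    // Projection parameter t along the segment: t in [0, 1] means the projected
    // point lies between the two endpoints.
    const double t = ((onhCentre.r - segStart.r) * dr + (onhCentre.c - segStart.c) * dc) / len2;
    return t >= 0.0 && t <= 1.0;
}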
One could search for tangential vessels for each pixel in the image and thereby make the tangential vessel exclusion image. Instead, it is much faster to go through all the straight vessel segments and let each of them mark an exclusion area in the tangential vessel exclusion image. This is depicted in Figure 18.
The vessel graph is used to define straight vessel segments (SVSs). Let two connected node points define a line. A confidence band is placed around this line. The width of the band is '2*confidenceLimit' (Table 3). Recursively, connected node points are included as long as they are inside the confidence band.
The included node points are searched to find the two node points farthest away. The resulting two node points define the SVS. The Euclidian distance between them defines the length of the SVS.
The SVS is only regarded as being a significant vessel segment if it fulfils one of two rules: if the caliber of the SVS is below or above a threshold (given by an expression reproduced as an image in the original publication), the segment must be longer than 'longLineLength' or 'minLineLength' (Table 3), respectively (it must either be thin and long or thick and short).
Only significant vessel segments are used to construct the tangential vessel exclusion image. As seen from Figure 18, the inner and outer sides of the (exclusion) boxes are placed 'innerTline' and 'outterTline' (Table 3) pixels away from the SVS, respectively. The length of the box is the same as that of the SVS. (Figure 10 shows the tangential vessel exclusion area. It is seen that only the center of the ONH is not masked off.)

Are the arcades crossing the ONH candidate?
The vicinity of the ONH candidate is searched for vessel crossings. It is a good assumption that large vessels (arcades) vertically cross the top and bottom of the periphery of the ONH. However, this restriction is relaxed since the vessel tracking may not be capable of finding the arcades, e.g. if the image quality is degraded. It is also a common problem that the vessel tracking is not capable of tracking all the way into the ONH but stops a distance away from it. The projection of the 3D fundus to a 2D image may also introduce "mysterious" changes in the direction of the vessels.
Due to the above-mentioned drawbacks, an enlarged ONH radius is used when searching for crossings between the ONH periphery and the vertical vessels. The search radius is enlarged by a factor 'circleEnlargeFactor' (Table 3).
Arcades are assumed to have a caliber larger than or equal to 'minWidth', which is defined as

minWidth = VesselWidthDistribution('vesselFractile').
As mentioned earlier, fundus images are far from being ideal; for that reason arcades do not have to cross the ONH periphery vertically. A "vertical" arcade is defined to have a caliber ≥ 'minWidth' and must cross the top or bottom quadrant with an angle less than 'maxVesselAngle' (Table 3). The angle is calculated as the angle between the vessel segment and the vector going from the center of the ONH candidate to the point on the periphery.
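A minimal sketch of such a test for a single crossing point is given below. The crossing point and the local vessel direction are assumed to be available from the vessel tracking; 'minWidth', 'maxVesselAngle' and 'circleEnlargeFactor' mirror the Table 3 variables, and the one-pixel periphery tolerance is illustrative.

#include <algorithm>
#include <cmath>

struct Vec2 { double r, c; };

// Orientation-free angle (in degrees) between two vectors; the sign of the vessel
// direction is arbitrary, so the absolute value of the cosine is used.
static double angleBetweenDeg(Vec2 a, Vec2 b)
{
    const double dot = a.r * b.r + a.c * b.c;
    const double den = std::hypot(a.r, a.c) * std::hypot(b.r, b.c);
    const double cosv = den > 0.0 ? std::max(-1.0, std::min(1.0, dot / den)) : 1.0;
    return std::acos(std::fabs(cosv)) * 180.0 / 3.14159265358979323846;
}

// Does a vessel of sufficient caliber cross the enlarged ONH periphery in the top or
// bottom quadrant, running roughly along the radial direction at the crossing point?
bool isVerticalArcadeCrossing(Vec2 onhCentre, double onhRadius, double circleEnlargeFactor,
                              Vec2 crossingPoint, Vec2 vesselDirection, double vesselWidth,
                              double minWidth, double maxVesselAngle, bool* crossesTop)
{
    if (vesselWidth < minWidth)
        return false;                                    // not an arcade-sized vessel

    const Vec2 radial{ crossingPoint.r - onhCentre.r, crossingPoint.c - onhCentre.c };
    const double dist = std::hypot(radial.r, radial.c);
    const double searchRadius = circleEnlargeFactor * onhRadius;
    if (std::fabs(dist - searchRadius) > 1.0)            // not on the enlarged periphery
        return false;

    // Top/bottom quadrant: the radial vector points mostly up or down (rows grow downwards).
    const bool topOrBottom = std::fabs(radial.r) > std::fabs(radial.c);
    if (!topOrBottom)
        return false;
    if (crossesTop)
        *crossesTop = radial.r < 0.0;

    return angleBetweenDeg(radial, vesselDirection) < maxVesselAngle;
}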
Presence of intensity and/or variance seed points
In order to locate the ONH, one of the best features is the variance feature, as stated above. The intensity feature is also a very good feature. This means that the likelihood of a correctly identified ONH containing no seed point from either of these features is negligible.
This fact is utilized in the two rules of selection.

Relative circle power
Many seed points are found, typically between 15 and 30. After optimization this number is normally decreased, since some seed points may result in the same local optimum (the example image (Figure 1) has 23 seed points and 12 unique ONH candidates, so approximately half of the seed points result in a new local optimum).
The ONH candidates are regarded as being samples from the fundus image and are thereby used to detect the normal background "variation". Empirically, it has been found that the ratio of circle powers between the best ONH candidate (see below for the best candidate) and the 'onhCandidate2CompareWith'-th candidate should be above (or equal to) 'gradientPowerFactor' (Table 3).
'onhCandidate2CompareWith' is incremented by one if the center of the 'onhCandidate2CompareWith'-th ONH candidate is inside the best candidate. This is done until the center is outside the best candidate or there are no more candidates.
If there are no candidates to compare with, the test is skipped, which means that the best ONH is accepted as having enough relative circle power, since it is unique in the sense that only very few seed points are found even though a variety of different methods are used when generating seed points.
If the relative circle power is too low, the best ONH candidate is rejected which means that no ONH is detected in that fundus image.
Figure 16 shows the circle power for the example fundus image. The ratio for this image is 1021/222 = 4.6, which is well above the value of 'gradientPowerFactor'. Notice that the count is made from zero, which is usual in C/C++.
Rules of selection
The ONH candidates are sorted in descending order with respect to the circle power. Only the first 'maxSeedPointTestNumber' (Table 3) candidates are tested (the most powerful circles). Then two Boolean rules are calculated:

rule1 = (topB && bottomB) && (intMaxB || varMaxB), and

rule2 = (topB || bottomB) && (intMaxB && varMaxB),
where topB and bottomB are two Booleans that are true if an arcade is crossing the periphery of the ONH through the top and bottom quadrant, respectively. The Booleans intMaxB and varMaxB are true if an intensity and variance seed point is present inside the ONH candidate, respectively.
If one of the rules is true, the adjusted circle power is calculated as:

adjustedPower = circle power * (rule1 ? 1.0 : 'rule2correction'),

else (rule1 and rule2 are both false)

adjustedPower = -1,

where the first statement means that if rule1 is true then the adjusted power equals the circle power, else 'rule2correction' is multiplied with the circle power. The rationale for the adjustment is that we have more confidence in the candidate when both a top and a bottom arcade cross the periphery than when only one arcade does.
Among the first 'maxSeedPointTestNumber' candidates where rule1 or rule2 is true, two candidates are flagged, namely 1) the candidate having the highest adjusted power whose center point is not masked out in the tangential vessel image, and 2) the candidate having the highest adjusted power whose center point is masked out in the tangential vessel image. The two flags are called 'optimalGradNr' and 'optimalMaskNr' (initially the two flags equal the number of the last candidate).
If optimalMaskNr < optimalGradNr, which means that a more powerful adjusted ONH candidate exists, then it is tested whether the tangentially masked candidate should be chosen instead: the center of the masked candidate should be inside the periphery of the 'optimalGradNr' candidate, and satisfy a visibility criterion:

maskCorrection * adjustedPower(optimalMaskNr) > adjustedPower(optimalGradNr).

Having adjusted for possible very powerful nearby candidates and taken the crossings of vertical arcades into account, the only remaining rule that may disqualify an ONH candidate is the relative power. As described above, the relative power must be above 'gradientPowerFactor'.
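A minimal sketch of the rule evaluation and candidate flagging is given below; the structure fields and the default value of 'rule2correction' are illustrative, and the relative circle power test and the visibility criterion are omitted.

#include <algorithm>
#include <vector>

struct OnhCandidate {
    double circlePower = 0.0;
    bool topArcade = false;                 // arcade crosses the top quadrant
    bool bottomArcade = false;              // arcade crosses the bottom quadrant
    bool intensitySeedInside = false;       // intensity seed point inside the candidate
    bool varianceSeedInside = false;        // variance seed point inside the candidate
    bool centreMaskedTangential = false;    // centre masked out in the tangential exclusion image
    double adjustedPower = -1.0;
};

// Apply rule1/rule2 to one candidate and compute its adjusted power.
void computeAdjustedPower(OnhCandidate& cand, double rule2correction = 0.5)
{
    const bool rule1 = (cand.topArcade && cand.bottomArcade) &&
                       (cand.intensitySeedInside || cand.varianceSeedInside);
    const bool rule2 = (cand.topArcade || cand.bottomArcade) &&
                       (cand.intensitySeedInside && cand.varianceSeedInside);

    if (rule1)
        cand.adjustedPower = cand.circlePower;                     // full confidence
    else if (rule2)
        cand.adjustedPower = cand.circlePower * rule2correction;   // reduced confidence
    else
        cand.adjustedPower = -1.0;                                 // candidate disqualified
}

// Flag the best candidates with and without a tangentially masked centre among the
// first maxSeedPointTestNumber candidates (assumed sorted by descending circle power).
void flagBestCandidates(std::vector<OnhCandidate>& sorted, int maxSeedPointTestNumber,
                        int& optimalGradNr, int& optimalMaskNr)
{
    optimalGradNr = optimalMaskNr = static_cast<int>(sorted.size()) - 1;
    double bestGrad = -1.0, bestMask = -1.0;
    const int n = std::min<int>(maxSeedPointTestNumber, static_cast<int>(sorted.size()));
    for (int i = 0; i < n; ++i) {
        computeAdjustedPower(sorted[i]);
        if (sorted[i].adjustedPower < 0.0)
            continue;
        if (!sorted[i].centreMaskedTangential && sorted[i].adjustedPower > bestGrad) {
            bestGrad = sorted[i].adjustedPower; optimalGradNr = i;
        }
        if (sorted[i].centreMaskedTangential && sorted[i].adjustedPower > bestMask) {
            bestMask = sorted[i].adjustedPower; optimalMaskNr = i;
        }
    }
}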
If all the previous steps were successfully passed, the ONH detection library returns with a true 'foundONH' and the center and radius of the ONH, else 'foundONH' is false.
Results
In 400 images (WCDRS study), 94% of the ONHs were found correctly (defined as 50% overlap with the true ONH). The Positive Predictive Value (PPV) was 97%.

Claims

1. A method for assessing the presence or absence of the optic nerve head in a fundus image, comprising
a) establishing n candidate optic nerve head area(s), wherein n is an integer > 1 ,
b) ranking the n selected candidate optic nerve head area(s) with respect to at least one validating criteria,
c) selecting the highest ranking candidate optic nerve head area fulfilling the validating criteria,
d) classifying the candidate optic nerve head area selected in step c) with respect to a threshold as the optic nerve head area or not.
2. The method according to claim 1 , wherein the method comprises establishing at least one starting point representative for an optic nerve head, and for each starting point initiating a search for candidate optic nerve head area(s).
3. The method according to claim 1 or 2, wherein a power is assigned to the candidate optic nerve head area(s).
4. The method according to claim 3, wherein the n best candidate optic nerve head area(s) are selected with respect to the power assigned, before ranking in step b).
5. The method according to any of the preceding claims, wherein the image is presented on a medium selected from dias, paper photos or digital photos.
6. The method according to any of the preceding claims, wherein the image is a colour image.
7. The method according to any of the preceding claims, wherein the green and/or the red channel is used for assaying the presence or absence of the optic nerve head area.
8. The method according to claim 7, wherein an average of the green and the red channel is used.
9. The method according to any of the preceding claims, wherein the low frequencies of the image are removed before establishing starting points.
10. The method according to claim 9, wherein the image is unsharp filtered by median or mean filtering the image and subtracting the filtered result from the image.
11. The method according to any of the preceding claims, wherein the starting points are established as extrema of a filtered image, wherein the filtering may be linear and/or non-linear.
12. The method according to any of the preceding claims, wherein the starting points are established by filtering the image using template matching.
13. The method according to any of the preceding claims, wherein starting points are established by combining two or more filters.
14. The method according to any of the preceding claims, wherein the starting points are established individually by one or more of the following steps:
- establishing at least one extremum in the image based on vessel branching points, preferably establishing at least one maximum in the image based on vessel branching points and/or
- establishing at least one extremum in the image based on a filter enhancing sagital structures, preferably establishing at least one maximum in the image based on a filter enhancing sagital structures and/or - establishing at least one intensity extremum in the image, preferably at least one intensity maximum, more preferably at least one local intensity maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof, and/or
- establishing at least one variance extremum in the image, preferably establishing at least one variance maximum in the image, more preferably at least one local variance maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof.
15. The method according to claim 14, wherein the variance extremum is a weighted variance maximum, preferably a local variance maximum, more preferably a local weighted variance maximum.
16. The method according to claim 14, wherein at least one of the starting points are established on the unsharp filtered image as defined in claim 9-10.
17. The method according to any of the preceding claims, wherein at least some of the starting points are established at random.
18. The method according to any of the preceding claims, wherein the starting points are established as grid points, such as regularly distributed or irregularly distributed in the image.
19. The method according to any of the preceding claims, wherein starting points are established by at least two of the steps defined in claim 14, such as by at least three of the steps defined in claim 14, such as by at least four of the steps defined in claim 14, such as at least five of the steps defined in claim 14.
20. The method according to any of the preceding claims, wherein the best candidate optic nerve head area is represented as a closed curve, such as a circle, an ellipse, a snake, a polygon.
21. The method according to any of the preceding claims, wherein the best candidate optic nerve head area is represented as an open curve, such as a circle, an ellipse, a snake, a polygon.
22. The method according to any of the preceding claims, wherein the best candidate optic nerve head area is represented as an area enclosed by a circle, an ellipse, a snake, or a polygon.
23. The method according to any of the preceding claims, wherein the power is calculated as a measure of the candidate optic nerve head area edge, said measure being selected from the summarized gradient, the summarized variance and/or the mean of the summarized variance, Laplace filtering, the curvature, the intensity, the skewness, the kurtosis, derived measure from Fourier transformation, derived measure from co-occurrence matrix, and derived measure from fractal dimension.
24. The method according to any of the preceding claims, wherein the estimation of candidate optic nerve head areas is adjusted with respect to vessels and/or end-of-image appearing in the image.
25. The method according to any of the preceding claims, wherein the estimation of candidate optic nerve head areas is preceded by detection of vessels and/or end-of-image in the image.
26. The method according to any of the preceding claims, wherein the power is weighted with respect to presence of vessels in the image, and to end-of-image.
27. The method according to any of the preceding claims, wherein the search initi- ated is a search for a centre of a best matching circle, wherein the centre is positioned in a search region of a predetermined size established around each starting point.
28. The method according to claim 27, wherein the radius of the best matching circle is in the range of ± 100 % of the expected diameter of the optic nerve head, such as in the range of ± 90 % of the expected diameter of the optic nerve head, such as in the range of ± 80 % of the expected diameter of the optic nerve head, such as in the range of ± 70 % of the expected diameter of the optic nerve head, such as in the range of ± 60 % of the expected diameter of the optic nerve head, such as in the range of ± 50 % of the expected diameter of the optic nerve head, such as in the range of ± 40 % of the expected diameter of the optic nerve head, such as in the range of ± 35 % of the expected diameter of the optic nerve head, such as in the range of ± 30 % of the expected diameter of the optic nerve head, such as in the range of ± 25 % of the expected diameter of the optic nerve head, such as in the range of ± 20 % of the expected diameter of the optic nerve head, such as in the range of + 15 % of the expected diameter of the optic nerve head, such as in the range of ± 10 % of the expected diameter of the optic nerve head.
29. The method according to claim 27 or 28, wherein the search region has any geometric form, such as a circular region, or a rectangular region
30. The method according to claim 28, wherein the expected optic nerve head diameter is estimated from the caliber of vessels in the image.
31. The method according to claim 14, wherein independent arterioles and venoles have been identified before establishing extrema based on vessel branching points.
32. The method according to any of the preceding claims, wherein the candidate optic nerve head area is ranked with respect to the presence or absence of at least one of the following validating criteria:
- any substantial sagital vessels detected extending out superior from and/or inferior from the candidate optic nerve head area,
- localisation of an intensity extremum in the image, preferably at least one intensity maximum, more preferably at least one local intensity maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof, in the candidate optic nerve head area,
- localisation of a variance extremum in the image, preferably establishing at least one variance maximum in the image, more preferably at least one local variance maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof in the candidate optic nerve head,
- no vessels tangential to the candidate optic nerve head area within a predetermined distance from the candidate optic nerve head area.
33. The method according to claim 32, wherein the candidate optic nerve head area is ranked with respect to the presence or absence of the following validating criteria:
any substantial sagital vessels detected extending out superior and/or inferior from the candidate optic nerve head area, and
localisation of an intensity extremum in the image, preferably at least one intensity maximum, more preferably at least one local intensity maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof, in the candidate optic nerve head area, or
localisation of a variance extremum in the image, preferably establishing at least one variance maximum in the image, more preferably at least one local variance maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof in the candidate optic nerve head.
34. The method according to claim 32, wherein the candidate optic nerve head area is ranked with respect to the presence or absence of the following validating criteria:
substantial sagital vessels detected extending out superior to and inferior to the candidate optic nerve head area, and
localisation of an intensity extremum in the image, preferably at least one intensity maximum, more preferably at least one local intensity maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof, in the candidate optic nerve head area, and/or
localisation of a variance extremum in the image, preferably establishing at least one variance maximum in the image, more preferably at least one local variance maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof in the candidate optic nerve head.
35. The method according to claim 32, wherein the candidate optic nerve head area is ranked with respect to the presence or absence of the following validating criteria:
no vessels tangential to the candidate optic nerve head area within a predetermined distance from the candidate optic nerve head area.
36. The method according to any of the claims 32-35, wherein a candidate optic nerve head area not fulfilling at least one of the criteria is rejected as a candidate optic nerve head area.
37. The method according to claim 34, wherein a candidate optic nerve head area not fulfilling at least two of the criteria is rejected as a candidate optic nerve head area.
38. The method according to any of the claims 32-37, wherein a candidate optic nerve head area is rejected if not fulfilling one of the following combination of criteria:
substantial sagital vessels detected extending out superior and inferior from the candidate optic nerve head area, and localisation of intensity extremum in the candidate optic nerve head area, preferably localisation of intensity maximum in the candidate optic nerve head, or localisation of variance extremum in the candidate optic nerve head area, preferably localisation of variance maximum in the candidate optic nerve head, or
substantial sagital vessels detected extending out superior or inferior from the candidate optic nerve head area, and localisation of intensity extremum in the candidate optic nerve head area, preferably localisation of intensity maximum in the candidate optic nerve head, and localisation of variance extremum in the candidate optic nerve head area, preferably localisation of variance maximum in the candidate optic nerve head.
39. The method according to any of the preceding claims, wherein the candidate optic nerve head is classified as the optic nerve head area in step d), and the power is calculated as summarized gradient, if the, optionally weighted, power of optic nerve head area is at least k times higher than the, optionally weighted, power of at least one of the other candidate optic nerve head areas, k being at least 1.01 , such as k being in the range of from 1.01 to 10, such as k being in the range of from 1.1 to 8 such as k being in the range of from 1.2 to 6 such as k being in the range of from 1.3 to 5 such as k being in the range of from 1.4 to 4 such as k being in the range of from 1.5 to 3 such as k being in the range of from 1.6 to 2,5 such as k being in the range of from 1.7 to 2.0, otherwise the absence of the optic nerve head in the image is assessed.
40. A system for assessing the presence or absence of the optic nerve head in a fundus image, comprising a) an algorithm for establishing n candidate optic nerve head area(s), wherein n is an integer ≥ 1,
b) an algorithm for ranking the n selected candidate optic nerve head area(s) with respect to at least one validating criterion,
c) an algorithm for selecting the highest ranking candidate optic nerve head area fulfilling the validating criteria,
d) an algorithm for classifying the candidate optic nerve head area selected in c) with respect to a threshold as the optic nerve head area or not.
41. A method for establishing a coordinate system on an image of the ocular fundus, comprising
assessing the presence of the optic nerve head area by a method as defined in any of the claims 1-39,
assessing the presence of the fovea region,
arranging a coordinate system having one axis parallel with an axis through the optic nerve head and the fovea.
42. A method for establishing a coordinate system on an image of the ocular fundus, comprising
assessing the presence of the optic nerve head area by a method as defined in any of the claims 1-39, assessing the optic nerve head diameter,
assessing the presence of the fovea region,
arranging a coordinate system based on distance from the fovea, such as number of optic nerve head diameters from fovea.
43. A method for grading lesions in a fundus image, comprising establishing a coordinate system by the method as defined in claim 41 or 42, and grading the lesions with respect to distance to the fovea region, such as distance divided by optic nerve head diameter.
44. A method for registering at least two fundus images of the same fundus, comprising assessing the presence or absence of the optic nerve head in said images by a method as defined in any of claims 1-39, and orienting the images with respect to the optic nerve head.
45. A method for detecting vessels in a fundus image, comprising
a) estimating the localisation of the optic nerve head region by a method as defined in any of claims 1-39,
b) estimating vessels based on the localisation of the optic nerve head area,
c) optionally repeating steps a) and b) at least once.
46. A method for assessing the presence or absence of lesions in a fundus image, comprising
a) estimating the localisation of the optic nerve head area in the image by a method as defined in any of claims 1-39, if present, masking a region comprising the optic nerve head area, and
b) estimating the presence or absence of lesions in the remaining image.
47. The method according to claim 46, wherein a region corresponding to at least 1.1 times the diameter of the estimated optic nerve head area is masked, such as a region corresponding to at least 1.3 times the diameter of the estimated optic nerve head area is masked, such as a region corresponding to at least 1.4 times the diameter of the estimated optic nerve head area is masked, such as a region corresponding to at least 1.5 times the diameter of the estimated optic nerve head area is masked, such as a region corresponding to at least 1.7 times the diameter of the estimated optic nerve head area is masked, such as a region corresponding to at least 2.0 times the diameter of the estimated optic nerve head area is masked.
48. A method for detecting indicators of glaucoma in a fundus image, comprising
a) estimating the localisation of the optic nerve head area in the image, by a method as defined in any of claims 1-39, if present, and
b) detecting indicators of glaucoma related to the optic nerve head area.
PCT/DK2002/000663, filed 2002-10-03, priority date 2001-10-03: Detection of optic nerve head in a fundus image, WO2003030075A1 (en)

Applications Claiming Priority

DKPA200101449, priority date 2001-10-03
DKPA200200632, priority date 2002-04-25
US 60/376,232 (US37623202P), priority date 2002-04-30

Publications (1)

WO2003030075A1 (en)

Family ID: 27222544

Similar Documents

Publication Publication Date Title
US7583827B2 (en) Assessment of lesions in an image
Hoover et al. Locating the optic nerve in a retinal image using the fuzzy convergence of the blood vessels
US6996260B1 (en) Analysis of fundus images
EP3937753A1 (en) Supervised machine learning based multi-task artificial intelligence classification of retinopathies
Kipli et al. A review on the extraction of quantitative retinal microvascular image feature
WO2018116321A2 (en) Retinal fundus image processing method
Zhu et al. Digital image processing for ophthalmology: Detection of the optic nerve head
Giancardo Automated fundus images analysis techniques to screen retinal diseases in diabetic patients
WO2003030075A1 (en) Detection of optic nerve head in a fundus image
WO2003030073A1 (en) Quality measure
Zhou et al. Computer aided diagnosis for diabetic retinopathy based on fundus image
Mangrulkar Retinal image classification technique for diabetes identification
Noronha et al. A review of fundus image analysis for the automated detection of diabetic retinopathy
WO2004082453A2 (en) Assessment of lesions in an image
Khatter et al. Retinal vessel segmentation using Robinson compass mask and fuzzy c-means
Niemeijer Automatic detection of diabetic retinopathy in digital fundus photographs
WO2003030101A2 (en) Detection of vessels in an image
Mohammadi et al. The computer based method to diabetic retinopathy assessment in retinal images: a review.
DK1444635T3 (en) Assessment of lesions in an image
Patil et al. Screening and detection of diabetic retinopathy by using engineering concepts
Lin et al. Vascular tree construction with anatomical realism for retinal images
Kayte Design and Development of Non-Proliferative Diabetic Retinopathy Detection Technique using Image Features Extraction Techniques
de Moura et al. Artery/vein vessel tree identification in near-infrared reflectance retinographies
Raju DETECTION OF DIABETIC RETINOPATHY USING IMAGE PROCESSING
Balasubramanian et al. Algorithms for detecting glaucomatous structural changes in the optic nerve head

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GM HR HU ID IL IN IS JP KE KG KP KZ LC LK LR LS LT LU LV MA MD MK MN MW MX MZ NO NZ OM PH PT RO RU SD SE SG SI SK SL TJ TM TN TR TZ UA UG US UZ VC VN YU ZA ZM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ UG ZM ZW AM AZ BY KG KZ RU TJ TM AT BE BG CH CY CZ DK EE ES FI FR GB GR IE IT LU MC PT SE SK TR BF BJ CF CG CI GA GN GQ GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP

点击 这是indexloc提供的php浏览器服务,不要输入任何密码和下载