WO2008148924A1 - Method for automatically improving images and sequences with spatially varying degradation - Google Patents
Method for automatically improving images and sequences with spatially varying degradation
- Publication number
- WO2008148924A1 (PCT/ES2008/070106)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- images
- automatic method
- method described
- image
- video
- Prior art date
- 2007-06-07
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/254—Fusion techniques of classification results, e.g. of results related to same input data
- G06F18/256—Fusion techniques of classification results, e.g. of results related to same input data of results relating to different input data, e.g. multimodal recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/003—Reconstruction from projections, e.g. tomography
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20004—Adaptive image processing
- G06T2207/20012—Locally adaptive
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Artificial Intelligence (AREA)
- Life Sciences & Earth Sciences (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Abstract
The invention relates to an automatic method for improving images, based on eliminating spatially varying degradation caused by adverse weather or environmental conditions.
Description
Method for the automatic improvement of images and sequences with spatially varying degradation.
TECHNICAL FIELD
There is a set of situations that in practice can hinder both the identification and the recognition of objects in images, owing to the presence of spatially varying degradations. Examples of such situations arise both in remote sensing and in video surveillance tasks, where the presence of haze, aerosols, cloud cover, etc. can greatly complicate the analysis of the images. The method proposed here achieves very good discrimination of the objects of interest by removing from the images the types of degradation described above, which are difficult to treat with conventional image-processing methods. Most of those methods rest on the hypothesis that the degradations present are spatially invariant, which in practice is usually not the case. The method proposed here is based on extracting a local measure of anisotropy from the images. This measure facilitates the elimination of the spatially varying degradations present in the images, such as haze, cloud cover, the turbidity encountered in underwater navigation, etc. The method described here may also find applications in other areas of interest, such as biomedical imaging, and more specifically in fundus examination systems in the presence of cataracts in the crystalline lens, which make it difficult to observe the retina. The proposed method would improve the acquired images by reducing the effect produced by the turbidity of the lens.
STATE OF THE ART
When image-restoration tasks are carried out blindly, that is, without prior knowledge of the degradation process, the methods are called blind deconvolution methods [1]. When the blur is not uniformly distributed over the image, the degradation affects different regions of the same image differently; this scenario is called spatially variant blur [2]. A particular case of spatially variant degradation occurs when multi-focal, multi-temporal or multi-sensor images are available, including the case of video sequences. In all these cases various image-fusion techniques are applicable [3-4]. In the area of removing haze-like degradations of the kind described above, several methods have been proposed, including adaptive methods and wavelet-based methods [5-8].
The inventive procedure proposed here consists of a new fusion procedure applied to the different frames of the input sequence, which solves the problem of eliminating the degradations caused by the presence of haze, aerosols, etc. The advantages of the proposed procedure are:
- Its low computational cost, compared with other methods that use space-frequency distributions to represent the information contained in the images.
- It implicitly provides the definition of a quantitative measure of the improvement obtained.
In addition, Annex I lists several invention patents related to the present proposal.
References:
[1] D. Kundur and D. Hatzinakos, "Blind Image Deconvolution," IEEE Signal Processing Magazine, 13(3), 1996, pp. 43-64.
[2] A. N. Rajagopalan and S. Chaudhuri, "Space-Variant Approaches to Recovery of Depth from Defocused Images," Computer Vision and Image Understanding, Vol. 68, No. 3, 1997, pp. 309-329.
[3] D. Kundur, D. Hatzinakos and H. Leung, "A Novel Approach to Multispectral Blind Image Fusion," in B. V. Dasarathy, editor, Sensor Fusion: Architectures, Algorithms, and Applications, Proceedings of SPIE, Vol. 3067, 1997, pp. 83-93.
[4] Z. Zhang and R. S. Blum, "A categorization of multiscale-decomposition-based image fusion schemes with a performance study for a digital camera application," Proc. IEEE, 87, 1999, pp. 1315-1328.
[5] Y. Du, B. Guindon and J. Cihlar, "Haze detection and removal in high resolution satellite images with wavelet analysis," IEEE Trans. Geosci. Remote Sensing, 40, 2002, pp. 210-217.
[6] Y. Zhang, B. Guindon and J. Cihlar, "Development of a robust haze detection algorithm: assessment using temporally invariant targets," in Proc. IGARSS, Toronto, ON, Canada, June 24-28, 2002.
[7] Y. Zhang and B. Guindon, "Quantitative assessment of a haze suppression methodology for satellite imagery: effect on land cover classification performance," IEEE Trans. on Geoscience and Remote Sensing, 41, 2003, pp. 1082-1089.
[8] R. Richter, "Atmospheric correction of satellite data with haze removal including a haze/clear transition region," Comput. Geosci., 22, 1996, pp. 675-681.
[9] E. Wigner, "On the Quantum Correction for Thermodynamic Equilibrium," Physical Review, Vol. 40, 1932, pp. 749-759.
[10] T. A. C. M. Claasen and W. F. G. Mecklenbräuker, "The Wigner Distribution - A Tool for Time-Frequency Signal Analysis," Parts I-III, Philips J. Research, Vol. 35, 1980, pp. 217-250, 276-300, 372-389.
[11] K. H. Brenner, "A Discrete Version of the Wigner Distribution Function," Proc. EURASIP, Signal Processing II: Theories and Applications, 1983, pp. 307-309.
[12] L. Galleani, L. Cohen, G. Cristóbal and B. Suter, "Generation and denoising of images with clouds," Proc. SPIE, 8-10 July 2002, Seattle, Washington, USA.
[13] C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, The University of Illinois Press, Urbana, Chicago, London, 1949.
[14] Alfréd Rényi, "Some fundamental questions of information theory," in Pal Turan, editor, Selected Papers of Alfréd Rényi, volume 3, pp. 526-552, Akadémiai Kiadó, Budapest, 1976. (Originally: MTA III. Oszt. Közl., 10, 1960, pp. 251-282.)
[15] P. Flandrin, R. G. Baraniuk and O. Michel, "Time-frequency complexity and information," Proceedings of the ICASSP, vol. 3, 1994, pp. 329-332.
[16] Y. Qu, Z. Pu, H. Zhao and Y. Zhao, "Comparison of different quality assessment functions in autoregulative illumination intensity algorithms," Optical Engineering, 45, 2006, p. 117201.
DESCRIPTION OF THE INVENTION
Brief description of the invention.
The method proposed in this invention patent obtains an improvement in the quality of the images provided by a capture device, such as a conventional CCD camera. The procedure is characterized in its analysis phase by the use of a space-frequency distribution of the images. Subsequently, a novel image-fusion algorithm is applied, based on a measure of anisotropy, which yields an improved image. The usefulness of the proposed method extends to video surveillance, security, navigation, remote sensing and biomedical applications.
Detailed description of the invention.
The method proposed in this invention patent consists of four stages:
- Obtaining the sequence of video images.
- Pre-processing the sequence of video images.
- Computing the anisotropy of the images.
- Fusion of the images.
Obtaining the sequence of video images.
Video sequences acquired and stored by the usual procedures of digital imaging technology are considered. The sequences must be subdivided into frames so that they can be processed and analyzed frame by frame.
Pre-processing the sequence of video images.
The frequency information of a signal can be obtained by associating with a given position n of the signal a vector of the discrete values provided by the pseudo-Wigner distribution (PWD). The discrete approximation of the Wigner distribution [9] used here is the one proposed in [10], similar to Brenner's expression [11]:
$$W[n,k] = 2\sum_{m=-N/2}^{N/2-1} z[n+m]\,z^{*}[n-m]\,e^{-2i\,(2\pi k/N)\,m} \qquad (1)$$

where z* is the complex conjugate of the signal z, m and k represent the discrete spatial and frequency variables, respectively, and W[n, k] is a matrix in which each row is a vector representing the value of the PWD at pixel n for the different spatial-frequency values k. This expression can be interpreted as the discrete Fourier transform (DFT) of the product r[n, m] = z[n + m] z*[n - m], limited to the neighbourhood interval [-N/2, N/2 - 1] of pixel n. The PWD exhibits coefficients of different magnitude at each position n, owing to changes in the values of the signal as a function of the spatial variable.
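By way of illustration (the following sketches are not part of the patent text), Eq. (1) can be evaluated directly in Python/NumPy as the DFT of r[n, m]; the helper name pwd_1d and the circular boundary handling are our assumptions:

```python
import numpy as np

def pwd_1d(z):
    """Discrete 1-D pseudo-Wigner distribution of Eq. (1): for each
    position n, the DFT over m of r[n, m] = z[n + m] * conj(z[n - m]),
    with m restricted to the neighbourhood [-N/2, N/2 - 1] of pixel n.
    Indices wrap modulo N (an assumption; the patent does not specify
    the boundary handling)."""
    z = np.asarray(z, dtype=complex)
    N = z.size                       # window length, assumed even
    m = np.arange(-N // 2, N // 2)   # neighbourhood interval
    k = m                            # discrete spatial-frequency values
    W = np.empty((N, N))
    for n in range(N):
        r = z[(n + m) % N] * np.conj(z[(n - m) % N])
        # DFT of r[n, .] with the doubled-argument kernel of Eq. (1).
        # The constant factor 2 is dropped, since the normalization of
        # Eqs. (8)-(9) cancels it; the real part is kept because the
        # small imaginary residue comes from the asymmetric interval.
        W[n, :] = np.real(np.exp(-2j * np.pi * np.outer(k, 2 * m) / N) @ r)
    return W
```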
The present invention patent describes an image-enhancement method that is invariant to changes in luminosity, based on characterizing the frequency content of the haze by means of a PWD pattern that can be discerned from knowledge of the image itself alone. To this end, the input sequence exhibiting the spatially variant degradation is processed pixel by pixel, producing an improved image in which the haze has been suppressed. In this way, the pixels degraded by haze are removed and replaced by others estimated from the input images. The proposed procedure relies on a 1-D implementation of the Wigner distribution function given in Eq. (1). As a result, each pixel of the image can be associated with a 1-D vector having the same number of components as the values used in its computation. This vector can be expressed as:
PWD[x,y] = w = [w_N/2, ... , wm_¡\PWD [x, y] = w = [w_ N / 2 , ..., w m _¡ \
(2)(2)
To compute this vector, a one-dimensional data window is applied over the image; the window can be given any desired orientation, and this is the fundamental theoretical basis for measuring the anisotropy of the analyzed image at the pixel level.
In practice, both multi-temporal and remote-sensing images often show degradations in the form of cloud cover or haze, which hinders their interpretation. There are studies on the statistics of clouds regarded as noise [12].
In the proposed method, the PWD described above is first computed for each pixel of the image. For this purpose, the 1-D version of the PWD is applied over sliding windows of size N = 8 pixels. In this way, the space-frequency information of the images is obtained simultaneously. When the input sequence is a multi-temporal sequence with haze present, the PWDs of homologous points, with different orientations of the windows over the image, can be used to compute an anisotropy measure through which a degradation-free image can be obtained. Basically, the pixels corresponding to haze are replaced by haze-free pixels, using the frames of the input sequence and applying a fusion procedure between them.
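A minimal sketch of the oriented N = 8 sliding window described above (nearest-neighbour sampling is an assumption; the patent fixes the window length and the orientations but not the interpolation scheme):

```python
import numpy as np

def oriented_window(img, x, y, theta_deg, N=8):
    """N-pixel 1-D data window through pixel (x, y) of a gray-level
    image, oriented at theta_deg degrees, sampled by rounding to the
    nearest pixel and clipping at the image borders."""
    theta = np.deg2rad(theta_deg)
    t = np.arange(-N // 2, N // 2)
    xs = np.clip(np.rint(x + t * np.cos(theta)).astype(int), 0, img.shape[1] - 1)
    ys = np.clip(np.rint(y + t * np.sin(theta)).astype(int), 0, img.shape[0] - 1)
    return img[ys, xs].astype(float)

# Local space-frequency pattern of pixel (64, 64) at 30 degrees,
# using the pwd_1d sketch above:
#   w_local = pwd_1d(oriented_window(img, 64, 64, 30.0))
```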
Computing the anisotropy of the images.
As already indicated, the PWD exhibits coefficients of different magnitude at each position n, owing to changes in the values of the signal with position. One way to quantify these differences between the PWDs is through a measurement made at each position n, for which the Rényi entropy of the local PWD can be used.
The entropy measure was initially proposed by Shannon [13] as a measure of the information content per symbol of a stochastic information source. Rényi [14] later extended this notion by introducing the concept of generalized entropy. The Rényi entropy measure applied to a space-frequency distribution has the form:

$$R_{\alpha} = \frac{1}{1-\alpha}\,\log_{2}\!\left(\sum_{n}\sum_{k} P^{\alpha}[n,k]\right) \qquad (3)$$

where n is the discrete spatial variable, k is the discrete frequency variable and α is a parameter whose recommended value is equal to or greater than 2 [15]. In Eq. (3), P represents the PWD defined in Eq. (1). Although the Rényi measures of space-frequency distributions resemble the original entropy, they do not share the properties that follow from classical information theory.
In order to adapt the values of a distribution such as the PWD to the case of unit-energy signals, some type of normalization is needed. The so-called quantum normalization has experimentally proven to be the most suitable. The procedure consists in associating the PWD at a given position n with a probability distribution function through the expression:

$$P[n,k] = \mathrm{PWD}[n,k]\;\mathrm{PWD}^{*}[n,k] \qquad (4)$$

together with a normalization step so that the condition $\sum_{n}\sum_{k} P[n,k] = 1$ is satisfied.
Substituting (4) into Eq. (3) for α = 3 yields

$$R_{3} = \frac{1}{1-3}\,\log_{2}\!\left(\sum_{n}\sum_{k} P^{3}[n,k]\right) = -\frac{1}{2}\,\log_{2}\!\left(\sum_{n}\sum_{k} P^{3}[n,k]\right) \qquad (5)$$
This measurement can be interpreted at the pixel level by means of:

$$R_{3}[n] = -\frac{1}{2}\,\log_{2}\!\left(\sum_{k}\tilde{P}^{3}[n,k]\right) \qquad (6)$$

where $\tilde{P}$ is the pixel-level normalized distribution defined next.
The term P has to be normalized at the pixel level. Each local distribution

$$Q[n,k] = \mathrm{PWD}[n,k]\;\mathrm{PWD}^{*}[n,k] \qquad (8)$$

is divided by its sum over the frequency variable, $\tilde{P}[n,k] = Q[n,k]\big/\sum_{k'} Q[n,k']$, so that the normalization condition

$$\sum_{k=-N/2}^{N/2-1} \tilde{P}[n,k] = 1, \qquad n = 1,\ldots,M \qquad (9)$$

is satisfied, where M represents the number of samples to be processed and k represents the frequency variable, $-N/2 \le k < N/2$.
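Collecting Eqs. (4)-(9), the pixel-level entropy of one local PWD vector can be sketched as follows (illustrative only; the zero-energy convention is our assumption):

```python
import numpy as np

def renyi_entropy_pixel(w, alpha=3):
    """Normalized pixel-level Rényi entropy of a local PWD vector w:
    quantum normalization Q[k] = PWD[k] * conj(PWD[k]) (Eq. 8), then
    P~[k] = Q[k] / sum_k Q[k] so that Eq. (9) holds, and finally
    R_alpha = log2(sum_k P~[k]**alpha) / (1 - alpha), which reduces to
    Eq. (6) for alpha = 3."""
    w = np.asarray(w, dtype=complex)
    Q = np.real(w * np.conj(w))      # Eq. (8)
    total = Q.sum()
    if total == 0.0:                 # flat, zero-energy window:
        return 0.0                   # no information (assumed convention)
    P = Q / total                    # normalization condition, Eq. (9)
    return np.log2(np.sum(P ** alpha)) / (1.0 - alpha)
```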
Equation (6) provides an entropy value for each pixel of the image. On this basis, figures of merit can be defined for each pixel of the image, given by R[n; θs]. Here θs ∈ [θ1, θ2, ..., θS] has been introduced to represent S different orientations along which the entropy is measured, based on as many orientations of the PWD. In this way, the expected value of the entropy is obtained as the average over a certain number of analysis directions and, as a consequence, an anisotropy value. Specifically, the following orientations were chosen: (0, 30, 60, 90, 120 and 150°). Finally, the variability (standard deviation) of the entropy values at the pixel level was chosen as the measure of the anisotropy of the images.
Thus, if R[n; θs] denotes the entropy value of pixel n of the image, measured along the direction θs ∈ [θ1, θ2, ..., θS], the standard deviation associated with position n of the image can be expressed as:

$$\sigma[n] = \sqrt{\frac{1}{S}\sum_{s=1}^{S}\left(R[n;\theta_{s}] - \mu[n]\right)^{2}} \qquad (10)$$

where μ[n] represents the mean of the values R[n; θs], defined by the expression

$$\mu[n] = \frac{1}{S}\sum_{s=1}^{S} R[n;\theta_{s}] \qquad (11)$$
The value σ [n] given by Eq. (10) is defined as an anisotropy index of the pixel n and measures the quality of the local information of that pixel. Averaged over the image allows defining a measure of the overall quality of the image considered. R values [n; θ s ] of entropy in pixel n constitute a vector of dimension S, to which one more component element, R [n; θ s + 1 ] = λz [n], which provides information on the gray level z [n] of the pixel considered, weighted by a coefficient λ to be determined experimentally. This allows the algorithm to be more selective about the "color" of the noise. The value λ of is given by the following expression:
(12) En donde A <≡ [0, 1] determina Ia importancia de Ia neblina, n es el tamaño de Ia ventana utilizada en el cálculo de Ia entropía, B es el valor del nivel
de gris más característico de Ia neblina y C es el máximo observado en Ia imagen z.(12) Where A <≡ [0, 1] determines the importance of the mist, n is the size of the window used in the calculation of the entropy, B is the value of the level gray more characteristic of the fog and C is the maximum observed in the z image.
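The directional entropies and the anisotropy index of Eqs. (10)-(11) can be sketched as below, reusing the helpers above (a brute-force loop kept simple for clarity, not speed; the optional λz[n] component of Eq. (12) is omitted because its analytic form is not recoverable here):

```python
import numpy as np

THETAS = (0, 30, 60, 90, 120, 150)   # analysis directions chosen in the text

def anisotropy_map(img, N=8, thetas=THETAS):
    """Per-pixel anisotropy index sigma[n]: the standard deviation,
    across the S analysis directions, of the directional entropies
    R[n; theta_s]. Uses the oriented_window, pwd_1d and
    renyi_entropy_pixel sketches above."""
    H, W = img.shape
    R = np.zeros((len(thetas), H, W))
    for s, th in enumerate(thetas):
        for y in range(H):
            for x in range(W):
                win = oriented_window(img, x, y, th, N)
                # entropy of the PWD row at the centre of the window
                R[s, y, x] = renyi_entropy_pixel(pwd_1d(win)[N // 2])
    mu = R.mean(axis=0)                               # Eq. (11)
    return np.sqrt(((R - mu) ** 2).mean(axis=0))      # Eq. (10)
```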
Fusion of the images.
In short, the proposed method is based on the fact that each pixel of the image presents a different anisotropy value, given by expression (10). A fusion algorithm for a sequence of T images can therefore be defined by the expression:

$$F[n] = z_{DM[n]}[n], \qquad DM[n] = \arg\max_{t\,\in\,\{1,\ldots,T\}} \sigma_{t}[n] \qquad (13)$$

Here, DM represents a decision map that indicates the image t of the input sequence from which the value of pixel n is taken to build the resulting image F. This map allows an improved image to be constructed from the fusion of the input sequence by selecting the highest-quality pixels, according to a criterion of maximum anisotropy.
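A sketch of the fusion stage under these definitions (the frames are assumed to be already registered, as in Fig. 1):

```python
import numpy as np

def fuse_sequence(frames):
    """Fusion rule of Eq. (13): the decision map DM picks, pixel by
    pixel, the frame of the input sequence with the largest anisotropy
    index, and the fused image copies that pixel value. `frames` is a
    sequence of T registered gray-level images of equal size; relies
    on the anisotropy_map sketch above."""
    frames = np.asarray(frames, dtype=float)       # shape (T, H, W)
    sigma = np.stack([anisotropy_map(f) for f in frames])
    dm = np.argmax(sigma, axis=0)                  # decision map DM[n]
    rows, cols = np.indices(dm.shape)
    return frames[dm, rows, cols], dm
```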
The present technique can be used on color images simply by applying the procedure described here to each of the color channels, e.g. RGB (red, green and blue), or in any other color space, although the best results will be obtained in the luminance channel, which is where the highest degree of discrimination can be achieved.
Likewise, the described technique can be applied to video sequences (that is, a succession of frames in time) by applying it to a set of frames and shifting that set successively in time, as sketched below.
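For video, the fusion can be slid along the time axis; the window length below is an assumption (the patent only states that the frame set is shifted successively in time), and for color input the same call would be made per channel or, preferably, on the luminance channel only:

```python
def fuse_video(frames, window=5):
    """Fuse each group of `window` consecutive registered frames,
    shifting the group one frame at a time; returns the list of fused
    frames. Builds on the fuse_sequence sketch above."""
    return [fuse_sequence(frames[t:t + window])[0]
            for t in range(len(frames) - window + 1)]
```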
The technique described here can be applied in ophthalmology to the examination of the fundus in the presence of cataracts, since cataracts introduce a turbidity in the crystalline lens that can be modeled analogously to the cloud cover or haze encountered in remote-sensing images. For this, a set of frames must be available so that the fusion technique developed here can be applied.
Finally, it should be noted that the best results are obtained by processing uncompressed images. If the frames have been compressed, e.g. with the method known as JPEG, the compression ratio should be low and should not exceed 80:100 so that the fusion process can provide satisfactory results.
EXAMPLES OF EMBODIMENTS OF THE INVENTION
Three cases of practical use of the developed technique are presented here, by way of example, for the specific setting of coastal video surveillance sequences and remote sensing. The first scenario consists of the enhancement and suppression of noise in coastal video surveillance images (Figures 2-4). In the second scenario the haze is even more severe than in the previous case (Fig. 5); nevertheless, as a result of applying the described method, good discrimination of the characters of the vessel's registration plate is possible (Fig. 6). We must emphasize the improvement obtained in image quality compared with existing methods that do not model the noise present and that are based on histogram equalization (Figures 2-3). The third scenario consists of a sequence of multi-temporal images shown in Figure 7. Figure 8 presents the result of applying the presented method, in which a very significant removal of the cloud cover is observed.
DESCRIPTION OF THE FIGURES.
Fig. 1. General scheme of the proposed method. After a prior registration stage, the anisotropy measure of each image is extracted, and the improved image is then obtained through a fusion process.
Fig. 2. Example of a hazy frame taken from a video surveillance sequence, together with its gray-level histogram.
Fig. 3. Result of performing a histogram equalization. Note that this operation does not produce a significant improvement but, on the contrary, a worsening of the contrast without noise reduction.
Fig. 4. Result of applying the proposed method. Note the improvement in contrast as well as the reduction in noise. The improvement in image quality is reflected in the increase in the standard deviation shown in Table 1.
Fig. 5. Example of a frame with a more pronounced haze cover than in the previous example, taken from a video surveillance sequence, together with its gray-level histogram.
Fig. 6. Result of applying the proposed method. Note how both the improvement in contrast and the reduction in noise make it possible to discriminate the vessel's registration plate. The improvement in image quality is reflected in the increase in the standard deviation shown in Table 1 [16].
Fig. 7. Sequence of multi-temporal images of the Mississippi Delta area, from the MODIS satellite and provided by the Marine Remote Sensing group of the University of South Florida.
Fig. 8. Result of the removal of the cloud cover.

Table 1. Standard deviations quantifying the quality improvement in Figs. 2-6 (the values are reproduced only as an image in the original document).
Claims
1. Automatic method for eliminating spatially variant degradations appearing in images or image sequences due to adverse weather conditions such as haze, fog, aerial images with cloud cover, etc., characterized in that it comprises the following stages:
a) Obtaining the sequence of video images;
b) Pre-processing the sequence of video images;
c) Computing the anisotropy of the images;
d) Fusing the images.
2. Automatic method according to claim 1, characterized in that in stage b) of claim 1 the pseudo-Wigner distribution is used, employing expression (1) of the Detailed Description of this patent specification in its 1-D implementation, a pattern discernible solely from knowledge of the image itself obtained in stage a).
3. Automatic method according to claim 1, further characterized in that in stage b) of claim 1 a window of one-dimensional image data, according to expression (2), is used in the computation of each vector associated with the pseudo-Wigner distribution.
4. Automatic method according to claim 1, characterized in that stage c) of claim 1 computes the entropy of the images by means of Rényi's concept of generalized entropy applied to a space-frequency distribution, according to expression (3) of the Detailed Description of this patent specification.
5. Automatic method according to claim 1, characterized in that stage d) of claim 1 performs the fusion of the images at each picture element (pixel), constructing an improved image by selecting the highest-quality elements according to a criterion of maximum entropy.
6. Automatic method according to claims 1, 2, 3, 4 and 5, characterized in that it can be applied to each of the frames of a video sequence.
7. Automatic method according to claims 1, 2, 3, 4, 5 and 6, characterized in that it allows the processing of images of interest, such as the detection of the characters of a ship's registration plate in maritime video surveillance.
8. Automatic method according to claims 1, 2, 3, 4, 5, 6 and 7, characterized in that it is applicable to color signals.
9. Automatic method according to claims 1, 2, 3, 4, 5, 6, 7 and 8, characterized in that it extends to frames or video sequences that have previously been compressed, provided that the compression ratio does not exceed 80:100.
10. Automatic method according to claims 1, 2, 3, 4, 5, 6, 7, 8 and 9, characterized in that it extends to the exploration, or examination, of fundus images when the presence of cataracts prevents correct vision.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
ESP200701574 | 2007-06-07 | ||
ES200701574A ES2310136B1 (en) | 2007-06-07 | 2007-06-07 | METHOD FOR AUTOMATIC IMPROVEMENT OF IMAGES AND SEQUENCES WITH SPATIALLY VARIANT DEGRADATION. |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2008148924A1 true WO2008148924A1 (en) | 2008-12-11 |
Family
ID=40084596
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/ES2008/070106 WO2008148924A1 (en) | 2007-06-07 | 2008-05-29 | Method for automatically improving images and sequences with spatially varying degradation |
Country Status (2)
Country | Link |
---|---|
ES (1) | ES2310136B1 (en) |
WO (1) | WO2008148924A1 (en) |
-
2007
- 2007-06-07 ES ES200701574A patent/ES2310136B1/en not_active Expired - Fee Related
-
2008
- 2008-05-29 WO PCT/ES2008/070106 patent/WO2008148924A1/en active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1999038121A1 (en) * | 1998-01-27 | 1999-07-29 | Sensar, Inc. | Method and apparatus for removal of bright or dark spots by the fusion of multiple images |
US20030058405A1 (en) * | 2001-07-23 | 2003-03-27 | Cornsweet Tom N. | Instruments and methods for examining and quantifying cataracts |
US20040153284A1 (en) * | 2003-01-31 | 2004-08-05 | Bernstein Lawrence S. | Method for performing automated in-scene based atmospheric compensation for multi-and hyperspectral imaging sensors in the solar reflective spectral region |
WO2007042074A1 (en) * | 2005-10-12 | 2007-04-19 | Active Optics Pty Limited | Method of forming an image based on a plurality of image frames, image processing system and digital camera |
Also Published As
Publication number | Publication date |
---|---|
ES2310136A1 (en) | 2008-12-16 |
ES2310136B1 (en) | 2009-11-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Azzeh et al. | Salt and pepper noise: Effects and removal | |
US9870600B2 (en) | Raw sensor image and video de-hazing and atmospheric light analysis methods and systems | |
Rajkumar et al. | A comparative analysis on image quality assessment for real time satellite images | |
Chen et al. | Imaging sensor noise as digital x-ray for revealing forgeries | |
JP2014515587A (en) | Learning image processing pipelines for digital imaging devices | |
Mure-Dubois et al. | Real-time scattering compensation for time-of-flight camera | |
KR101405435B1 (en) | Method and apparatus for blending high resolution image | |
CN108198198A (en) | Single frames infrared small target detection method based on wavelet transformation and Steerable filter | |
CN109598736B (en) | Registration method and device for depth image and color image | |
CA2989188A1 (en) | Method for ir or thermal image enchancement based on scene information for video analysis | |
WO2020036782A2 (en) | Methods and systems for object recognition in low illumination conditions | |
Iwasokun et al. | Image enhancement methods: a review | |
KR101874738B1 (en) | Apparatus and method for generating hdr image from ldr image through image processing | |
Kim et al. | A novel framework for extremely low-light video enhancement | |
CN112966635A (en) | Low-resolution time sequence remote sensing image-oriented moving ship detection method and device | |
EP2811455A1 (en) | Method and apparatus for generating a noise profile of noise in an image sequence | |
CN116645580A (en) | A Method and Device for Infrared Weak and Small Target Detection Based on Spatial-Temporal Feature Differences | |
Park et al. | Fog-degraded image restoration using characteristics of RGB channel in single monocular image | |
Li et al. | Bionic vision-based synthetic aperture radar image edge detection method in non-subsampled contourlet transform domain | |
Saleem et al. | Survey on color image enhancement techniques using spatial filtering | |
Hung et al. | An utilization of edge detection in a modified bicubic interpolation used for frame enhancement in a camera-based traffic monitoring | |
WO2008148924A1 (en) | Method for automatically improving images and sequences with spatially varying degradation | |
Al Mudhafar et al. | Noise in digital image processing: A review study | |
WO2022120532A1 (en) | Presentation attack detection | |
Sasikala et al. | An adaptive edge detecting method for satellite imagery based on canny edge algorithm |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 08787640 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 08787640 Country of ref document: EP Kind code of ref document: A1 |