WO2001011864A2 - System and method for processing optically scanned documents - Google Patents
System and method for processing optically scanned documents
- Publication number
- WO2001011864A2 (PCT/US2000/022036)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- distortion
- image
- captured image
- capturing
- pixel
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/387—Composing, repositioning or otherwise geometrically modifying originals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/04—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa
- H04N1/10—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa using flat picture-bearing surfaces
- H04N1/1013—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa using flat picture-bearing surfaces with sub-scanning by translatory movement of at least a part of the main-scanning components
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/04—Scanning arrangements
- H04N2201/0402—Arrangements not specific to a particular one of the scanning methods covered by groups H04N1/04 - H04N1/207
- H04N2201/0434—Arrangements not specific to a particular one of the scanning methods covered by groups H04N1/04 - H04N1/207 specially adapted for scanning pages of a book
Definitions
- This invention relates to the field of document scanning and/or reproduction of documents with the aid of electro-optical devices. More specifically, the present invention addresses the problems of scanning an image of a document into a photocopier, a mopier, a digital copier, a line studio camera, or a scanner when the document cannot be laid completely flat.
Description of the Related Technology
- Line studio cameras are similar to photographic cameras, which are well known in the art, but they move a CCD in the plane where regular cameras form the image exposed to the film.
- The paper image is illuminated and the CCD senses the reflected light.
- Digital cameras may also be used to capture images.
- Digital cameras use a CCD to capture all the light from the paper image at once.
- Sensing of the reflected light can be done in a snapshot, capturing all the light from the paper at once as with a digital camera, or by sensing the light from a point or a line at a time as with a line studio camera.
- An example of this type of system is disclosed in U.S. Patent No.
- This height function is then used to adjust mirrors, lenses, and light sources so that the image reflects a constant illumination, stays in focus, and is scanned at a constant speed relative to the page. While such a system may correct for the spatial distortion of a page being copied, such a system cannot correct for the spatial distortion of the page once an image of the page has been taken. Instead, the correction must be performed as the image is captured.
- Document restoration methods have been extensively applied to documents which are known to have been corrupted or have low resolution such as aerial photographs, old maps, archaeological images, et cetera.
- A method of non-linear equalization of degraded document images, due to repeated reproduction or fax transmission, is described in U.S. Patent No. 5,745,597 to Agazzi, et al., to improve the performance of optical character recognition (OCR) systems.
- Methods for distortion correction of scanned images based on analyzing text lines have been reported but these do not generalize to other objects such as lines, image edges and the like, nor consider applying these methods to the process of document reproduction.
- The main disadvantage of these methods is that they only correct lines of text, and they do not consider the distortion caused by changing object illumination or focus distortion.
- A method and apparatus are provided for reducing various problems that are caused when scanning a bound document that is not placed completely flat on a scanner platen.
- One aspect of the invention includes a method of correcting for distortion in a captured image of an object, the method comprising identifying one or more background sections of a captured image, generating a distortion coefficient based upon the background sections, and removing distortion from the captured image by use of the distortion coefficient.
- Yet another aspect of the invention includes a method of removing distortion from a captured image of an object, the method comprising capturing an image of an object, and automatically removing, in response to the capturing, compression distortion from the captured image.
- Yet another aspect of the invention includes a method of removing distortion from a captured image of an object, the method comprising capturing an image of an object, and automatically removing, in response to the capturing, illumination distortion from at least a portion of the captured image.
- Yet another aspect of the invention includes a method of removing distortion from a captured image, the method comprising capturing an image of an object, and automatically removing, in response to the capturing, glare from at least a portion of the captured image.
- Yet another aspect of the invention includes a method of removing distortion from a captured image of an object, the method comprising capturing an image of an object, and automatically removing, in response to the capturing, focus distortion from at least a portion of the captured image.
- Yet another aspect of the invention includes a method of removing distortion from a captured image, the method comprising capturing an image of an object, and automatically removing, in response to the capturing, spatial distortion from the captured image.
- Yet another aspect of the invention includes a method of removing distortion from a captured image, the method comprising capturing an image of an object, automatically removing, in response to the capturing, compression distortion from at least a portion of the captured image, automatically removing, in response to the capturing, illumination distortion from at least a portion of the captured image, automatically removing, in response to the capturing, glare from at least a portion of the captured image, automatically removing, in response to the capturing, focus distortion from at least a portion of the captured image, and automatically removing, in response to the capturing, spatial distortion from at least a portion of the captured image.
- Yet another aspect of the invention includes a distortion correction apparatus comprising a pixel color and illumination analyzer adapted to analyze the color and illumination of each pixel of a distorted image, a distortion coefficient generator, in data communication with the pixel color and illumination analyzer, adapted to generate values representative of a distortion coefficient of the distorted image, and a transformer, in data communication with the distortion coefficient generator, adapted to transform the distorted image by use of the distortion coefficient.
- Figure 1 is a block diagram of a system for scanning bound documents.
- Figure 2 is a flowchart illustrating a process of converting a captured image into a corrected image.
- Figure 3 is a flowchart illustrating in further detail the process of Figure 2 for removing distortion from a captured image.
- Figure 4 is an illustration of an exemplary 3-D histogram of an image that is captured by the system of Figure 1.
- Figure 5 illustrates a document image with an enlargement of a pixel and the color numbers of the pixel that is part of a captured image.
- Figures 6A, 6B, and 6C respectively illustrate a gray pixel, a pixel that approximates gray, and a non-gray pixel.
- Figure 7 is an illustration of a captured image wherein a selected pixel in each column of the captured image has been identified as being proximate to a spine of a book.
- Figure 8 is an illustration of an exemplary image that has been captured by the system of Figure 1, wherein background areas and foreground areas are identified.
- Figure 9 is a representational block diagram illustrating a background line and a foreground line of a captured image.
- Figure 10 is a histogram illustrating the gray intensities for the background and foreground separately, as well as for the background and foreground combined.
- Figure 11 is a diagram illustrating a numeric circle that is used for determining a break point in an image that is captured by the system of Figure 1.
- Figure 12 illustrates a graphical user interface for viewing an image that is captured by the system of Figure 1.
- Figure 13 illustrates an image having spatial distortion.
- Figure 14 illustrates the image of Figure 13 subsequent to the removal of the spatial distortion.
- Figure 15 is a partial side elevational view of the book of Figure 1.
- Figure 16A illustrates an image that has compression distortion.
- Figure 16B illustrates the image of Figure 16A without compression distortion.
- Figure 17A illustrates an image that has illumination distortion.
- Figure 17B illustrates the image of Figure 17A after illumination distortion has been removed.
- Figure 18A illustrates an image that has glare.
- Figure 18B illustrates the image of Figure 18A wherein the glare has been removed.
- Figure 19 is a partial side elevational diagram of the system shown in Figure 1, wherein the cause of glare is shown.
- Figure 20A illustrates an image that has focus distortion.
- Figure 20B illustrates the image of Figure 20A wherein the focus distortion has been removed.
- Figures 21A and 21B are diagrams illustrating how out-of-focus distortion occurs.
- Figure 22A illustrates a pixel that is focused.
- Figure 22B illustrates a pixel that is out of focus.
- Figure 23 illustrates a pivoting plane that is used to remove focus distortion, such as is shown in Figures 20A and 22B.
- Figure 24 illustrates the characteristics of the pixel population being analyzed by the pivoting area.
- A scanning system 1 is adapted to operate in a conventional document scan mode as well as a book copying mode.
- The system 1 is placed beneath a transparent image platen 2, which is usually of glass or other rigid transparent material.
- A document 3 is shown placed on the platen 2, folded at the spine 4.
- The left break point 5 and the right break point 6 represent the points where the document 3 starts to move away from the platen 2.
- The scanning system 1 includes a radiation source such as a lamp 7, mounted within a reflector 8. Radiant energy or light from the lamp 7 is reflected upwardly through the platen 2 and irradiates the document 3 that is placed on the platen 2.
- The scanning system 1 also includes a scanning mirror 9 positioned to receive radiation reflected from the document 3.
- The lamp 7, the reflector 8, and mirror 9 may be mounted on a common housing 10.
- The reflected optical image data received on scanning mirror 9 is directed via a suitable mirror assembly 11 to an optical lens focusing assembly 12 that preferably is of a simple, fixed-focus type.
- The optical focusing assembly 12 focuses the optical image information into a beam 13, which is directed onto a suitable photo-sensor 14, which converts the optical image data of the beam 13 into electrical image data.
- Known types of photo-sensors can be used, such as a Charge-Coupled-Device (CCD).
- The electrical image data signal from CCD 14 can be converted by an analog-to-digital converter, if required, though many CCD systems now possess a digital output, before being sent to a computer for processing.
- The lamp 7, the reflector 8, the scanning mirror 9, the mirror assembly 11, the focusing assembly 12, and the CCD 14 can be commonly mounted in a scanning carriage 15.
- The carriage 15 is adapted to move from left to right beneath the platen 2 at pre-set or controllable scanning speeds.
- The scanning carriage 15 undergoes a pre-scan excursion to the left prior to initiation of the scan exposure cycle, with initial acceleration (and vibration damping) taking place in the pre-scan zone shown as PS.
- The start-of-scan position is identified as point S0.
- The end-of-scan position is identified as point S1.
- Figure 2 is a flowchart showing the steps for acquiring less distorted image data with the scanning system 1. Depending on the embodiment, certain steps may be removed and others may be added.
- The scanning system 1 is set up as shown in Figure 1. Starting at a state 100, an object, such as a book, is selected for copying using the system 1. Moving to a state 104, the image is captured using the scanning system 1.
- The scanning carriage 15 and the mirror assembly 11 are moved from a start-of-scan position S0 to an end-of-scan position S1.
- The document 3 is thus illuminated by lamp 7, and the reflected image is received by the mirror 9, which reflects a latent image of the document surface to the mirror assembly 11 and then to the lens assembly 12.
- The lens assembly 12 focuses the latent image data into beam 13, which is directed onto a suitable photo-sensor 14.
- The scanning system 1 converts the optical image data into electronic image data. Moving to a state 112, once the electronic image data is in the photo-sensor 14, distortion in the image data is corrected.
- The distortion in the captured image is removed by a computer 17 that is connected to the system 1.
- The system 1 incorporates in software or hardware the processes that are used to correct the image distortion.
- The computer 17 and the system 1 may be connected to a display device that is used to verify the existence and type of distortion that is identified by the computer 17 and/or the system 1.
- The process of correcting image distortion is described below in further detail with respect to Figure 3.
- The corrected image may then be printed, saved, or sent to a computer for further processing.
- Figure 3 is a flowchart illustrating a process of correcting certain distortion in a captured image. Depending on the embodiment, certain steps may be removed and others may be added.
- A 3-D histogram is created by taking the 2-D image electrical data and analyzing the luminescence of each pixel 20.
- The 3-D histogram is a graphic representation of background areas. Values of a pixel are in the range of 255 (bright) to 0 (dark).
- The 3-D histogram uses only the pixels in the range 255 to 115, because pixels in letters are typically in the range 115 to 0 and cause noise.
- Figure 5 illustrates the pixel values for an exemplary pixel. With this information, it is possible to create a 2-D histogram to find the distortion coefficient of the document (described below).
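- As an illustration of this step only (not the patent's code), the following simplified sketch computes per-pixel gray averages and histograms only the values from 115 to 255, so that letter pixels do not add noise; the array layout and function name are assumptions.
```python
import numpy as np

def background_histogram(image_rgb, dark_cutoff=115):
    """Histogram of per-pixel gray averages, ignoring pixels darker than
    `dark_cutoff` (letter pixels, which would otherwise add noise)."""
    gray = image_rgb.astype(np.float32).mean(axis=2)   # average of R, G, B per pixel
    bright = gray[gray >= dark_cutoff]                  # keep only the 115..255 range
    hist, _ = np.histogram(bright, bins=256 - dark_cutoff, range=(dark_cutoff, 255))
    return hist
```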
- The spine of the book is identified.
- The darkest color pixel 27 (Figure 7) in each row is identified. In one embodiment, the darkest pixel is that pixel having the lowest color average number. Once the darkest pixel is identified, its location is recorded. This process is repeated for each line 26, until the darkest color pixel 27 has been located and the column 28 that corresponds to where the darkest color pixel 27 is located has been detected. The darkest pixels tend to fall within a selected column, which is assumed to identify the spine of the book.
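- A minimal sketch of this spine search is shown below, assuming an RGB image array; the function and variable names are illustrative and not taken from the patent.
```python
import numpy as np

def find_spine_column(image_rgb):
    """Locate the column where the darkest pixel of each row tends to fall,
    which is assumed to mark the spine of the book."""
    gray = image_rgb.astype(np.float32).mean(axis=2)   # lowest color average = darkest
    darkest_cols = gray.argmin(axis=1)                 # darkest color pixel 27 in each line 26
    # The darkest pixels tend to cluster in one column (column 28): take the most frequent one.
    counts = np.bincount(darkest_cols, minlength=gray.shape[1])
    return int(counts.argmax())
```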
- The background area 24 is identified as the area where there is no significant density of ink.
- The foreground area 25 is identified as the area of the document 3 where most objects, such as photographs, letters, lines, etc., are found.
- Figures 8 and 9 each illustrate how background areas and foreground areas are differentiated.
- The lines that are candidates to be background areas are identified by analyzing each of the pixel colors of the captured image.
- Background lines 29 are those lines where the sum of gray pixels 21 and almost-gray pixels 22 is larger than the sum of the non-gray pixels 23.
- Foreground lines 30 are those where the sum of the non-gray pixels is larger than the sum of gray pixels 21 and almost-gray pixels 22.
- A gray pixel 21 is a pixel that has the same amount of blue, green, and red.
- An almost-gray pixel 22 is a pixel whose color values of blue, green, and red do not vary much from one another.
- A non-gray pixel 23 is a pixel whose color values of blue, green, and red vary greatly from one another.
- Figures 6A, 6B, and 6C illustrate examples of the different types of pixels. For example, if a line that crosses a picture is analyzed, it has more non-gray pixels 23 than the sum of gray pixels 21 and almost-gray pixels 22. In this case, the line is classified as a foreground line 30. If a line that passes between two lines of text is analyzed, the sum of gray pixels 21 and almost-gray pixels 22 is typically greater than the sum of the non-gray pixels. In this case, the line would be classified as a background line 29.
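- The following sketch classifies each line as background or foreground using the gray / almost-gray / non-gray distinction above; the tolerance that separates an almost-gray pixel from a non-gray pixel is not specified in the text and is an assumed parameter here.
```python
import numpy as np

def classify_lines(image_rgb, almost_gray_tol=20):
    """Return a boolean array: True where a line is a background line 29,
    False where it is a foreground line 30."""
    img = image_rgb.astype(np.int32)
    # Spread between the largest and smallest of the R, G, B values of each pixel.
    spread = img.max(axis=2) - img.min(axis=2)
    grayish = spread <= almost_gray_tol        # gray pixels 21 plus almost-gray pixels 22
    non_gray = ~grayish                        # non-gray pixels 23
    # A line is background when its gray and almost-gray pixels outnumber its non-gray pixels.
    return grayish.sum(axis=1) > non_gray.sum(axis=1)
```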
- A column histogram illustrating the gray average of each of the background lines is prepared.
- Figure 10 illustrates an exemplary histogram.
- A line 31 represents the points of the gray average of each column in the image of the document 3, including the background area 24 and the foreground area 25.
- The line 31 is usually very distorted because it contains a lot of noise.
- A line 32 represents the points of the gray average of each column of the background areas 24. As can be seen, the line 32 does not contain as much noise as line 31, and it provides a more uniform line with less variance.
- The line 32 is used for finding the break point 35 (Figure 11). In one embodiment, the break point 35 is located by constructing a numeric circle 38.
- An average W is calculated.
- The number W is used to construct an interval (W - M, W + M) that determines the width (2 * M) and the center (W) of the circle.
- M is 7.
- The next data point in the straight line (N + 1) is analyzed. If the coordinates of the subsequent data fall within the limits, the point is designated as accepted data 39 and it is recorded as being the last point accepted. The data that does not fall within the circle is designated as rejected data 40. Every time the numeric circle 38 accepts a new data point, it is taken as one of the last N data points.
- The break point 35 is detected as being the point where the histogram begins to curve, and it is presumably where the document 3 starts to separate from the platen 2.
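- A sketch of this numeric-circle test is given below, assuming the background column-average curve (line 32) is supplied as a one-dimensional array; the number N of recently accepted points used to compute W is not stated in the text and is an assumed parameter.
```python
def find_break_point(column_averages, n_last=10, m=7):
    """Walk the background column-average curve (line 32) and return the index
    of the first value rejected by the numeric circle 38, taken as the break point 35."""
    accepted = list(column_averages[:n_last])      # seed with the first N points
    for i in range(n_last, len(column_averages)):
        w = sum(accepted[-n_last:]) / n_last       # center W of the circle
        value = column_averages[i]
        if w - m <= value <= w + m:                # inside the interval (W - M, W + M)
            accepted.append(value)                 # accepted data 39, kept among the last N
        else:
            return i                               # rejected data 40: the curve starts to bend
    return None                                    # the page never separates from the platen
```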
- A graphical interface (Figure 12) may be displayed to the user to allow the user to correct the distortion coefficient for the captured image.
- The graphical interface comprises several handles and guide lines that allow the user to correct the calculated distortion values of the captured image.
- The graphical interface comprises: (i) at least two handles 43 for identifying the straight portion of the document 3; (ii) at least two handles 45 and 46 for each page of the book for designating an arc that is representative of the warping in the captured image due to the binding; and (iii) handles 47 and 48 for designating the spine of the book.
- Handles 49 may be used to designate a section of a page where glare is appearing. The user has the option of manipulating the handles to correct the displayed data.
- The distortion correction techniques described more fully below may be performed automatically or in conjunction with changes indicated by the user via use of the handles 42-49.
- Spatial distortion 51 is caused by the binding 4 of the book, which makes the upper areas 50 of the document 3 tend to bend towards the lower part 53 of the document 3, and makes the lower part 53 of the document tend to bend upwards towards the upper areas 50 of the document 3.
- The lines in the upper area 50 of the document have a more marked bend than the horizontal lines that are closer to the middle 52 of the image of the document 3. Equations 2-5 may be used to determine the correct line for a pixel that falls within a row having deformation.
- Figure 13 illustrates an image having spatial distortion 51, and Figure 14 illustrates an image after the spatial distortion 51 has been corrected.
- Equation 5 indicates a new column for the selected pixel falling within the deformation area.
- Figure 15 illustrates a side view of a document image when it is lying flat on the platen 2.
- The compression of the image is caused by the fact that, as the document 3 separates from the platen 2, it registers an image of a smaller longitudinal length 55 than it would if it were lying flat. This causes the image of the document 3 to appear compressed as the curvature of the document 3 increases, whereas the flat length 56 of the document 3 is longer.
- Figure 16A illustrates a document that was scanned using known copiers.
- Figure 16B illustrates the document after it has been decompressed by the foregoing process. It is noted that the changes are more noticeable in columns closer to the binding of the book.
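- The patent's decompression equations are not reproduced in this text, so the sketch below only illustrates the general idea under stated assumptions: columns between the break point and the binding are stretched by a factor standing in for the ratio of the flat length 56 to the registered longitudinal length 55; deriving that factor from the page curvature is not shown here.
```python
def decompress_columns(image, break_col, stretch):
    """Stretch the columns between the break point and the binding horizontally.
    `stretch` (> 1) is a placeholder for the flat-length-56 / registered-length-55
    ratio; the patent derives it from the page curvature, which is omitted here."""
    w = image.shape[1]
    out = image.copy()
    for x in range(break_col, w):
        # Map each output column back to the compressed source column (nearest neighbor).
        src = break_col + (x - break_col) / stretch
        out[:, x] = image[:, min(int(src), w - 1)]
    return out
```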
- Illumination distortion is removed from the captured image.
- Figure 17A illustrates an image without any illumination correction.
- Figure 17B illustrates an image after the illumination factors have been applied.
- The illumination distortion removal process accounts for the fact that illumination is lost in a linear and hyperbolic manner; that is, in the uppermost area 50 and lowermost area 53 of the document 3, the illumination is the worst, and in the middle part 52, it is not lost as much.
- The process to increase illumination has two factors: the linear increase and the parabolic increase.
- The linear increase is the same for the entire column that is being fixed. That is, the amount of increase is the same for a pixel 20 that is on the upper part 50 of the document as it is for a pixel 20 that is in the middle part 52 or at the lower part 53 of the document.
- The second increase is made to fix the change of illumination in the page. This is due to the fact that the illumination in an image tends to be greater in the center part 52 of the document than on the upper part 50 and lower part 53 of the document 3. This can be corrected by using a parabolic method to manipulate the increase of illumination in such a manner that it will be null in the middle 52 of the document and greatest in the extreme upper part 50 and lower part 53. Equations 8 and 9, set forth below, are used to correct the illumination.
- Equations 8 and 9 determine the parabolic and the linear increase that should be added to the pixel. Equations 8 and 9 are applied separately to each of the three colors of a selected pixel whose illumination has been distorted.
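- Equations 8 and 9 themselves are not reproduced in this text, so the sketch below only illustrates the two-part idea described above: a constant (linear) boost applied to the whole column, plus a parabolic boost that is zero at the middle row 52 and largest at the upper part 50 and lower part 53, applied separately to each color channel. The boost magnitudes are assumed parameters, not the patent's values.
```python
import numpy as np

def brighten_column(column_rgb, linear_gain=20.0, parabolic_gain=40.0):
    """column_rgb: (height, 3) slice of the image at one x position.
    Adds a constant boost plus a parabolic boost (zero at mid-height,
    maximal at the top and bottom) to each of the three colors."""
    h = column_rgb.shape[0]
    rows = np.arange(h, dtype=np.float32)
    mid = (h - 1) / 2.0
    parabola = ((rows - mid) / mid) ** 2              # 0 at the middle 52, 1 at the extremes
    boost = linear_gain + parabolic_gain * parabola   # same linear term for the whole column
    out = column_rgb.astype(np.float32) + boost[:, None]
    return np.clip(out, 0, 255).astype(np.uint8)
```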
- Any glare 62 in the captured image is corrected.
- Glare may be produced by the bouncing of light when the page is being scanned. It is noted that the inner part 63 of the area with glare is brighter than the lateral areas 64 of the glare.
- Figure 18 illustrates an image with a glare and an image without a glare.
- Figure 19 illustrates how the light is emitted from lamp 7, reflected by reflector 8, directed towards the left page 15, and reflected onto the right page 16. Since the image on the right page 16 is receiving light from both the lamp 7 and the left page 15, the right page 16 receives an excessive amount of illumination that causes many of the pixels 20 of this area to lose their color and give the effect of a glare 62. One way of reducing this excessive amount of light or glare 62 in the image of the document 3 is to reconstruct the color of the original pixel 20. Equation 10, set forth below, may be used to eliminate glare.
- By use of Equation 10, the area of the light glare 62 is darkened and the pixels acquire a color that is very similar to the original color. Since the colors that are darker lose more color, Equation 10 adds more color to these pixels.
- Focus distortion is caused when the document 3 moves away from the platen 2, as it goes towards the binding 4 of the book.
- Figure 20A illustrates an image that is out of focus in the area where the document is not aligned with the platen 2, and Figure 20B illustrates an image to which the out-of-focus correction factors have been applied.
- Figures 21A and 21B illustrate how the focusing assembly 12 is focused on the platen 2 with a specific distance between the platen 2 and the document 3. If the document is not flat, as is the case with most bound documents, the distance between the platen 2 and the document 3 varies and produces a defocused image. A method for correcting this out-of-focus distortion is described below.
- Figure 22B illustrates how focus distortion can distort a pixel, such as is shown in Figure 22A.
- The focus distortion causes a mixture of information amongst pixels and a loss of visual clearness.
- One manner of fixing this problem is to detect the pixels that have lost their color and to restore it, and to subtract color from those pixels that have gained color.
- Figure 23 illustrates a pivot plane 64 that analyzes and fixes the out of focus portions of the captured image.
- The out-of-focus portion of the image typically begins where the value of the slope begins to increase, i.e., where the document image is separating from the platen 2.
- A pivot plane 64 is created that is composed of three specific areas: the pivot area 63, and the sweeping areas 66 and 67.
- The pivot pixel 65 is the pixel 20 that is in the center and which has become out of focus.
- The pivot area 63 is the area surrounding the pivot pixel 65. In one embodiment, the pivot area 63 is a 5 x 5 area of pixels.
- The sweeping areas 66 and 67 respectively include two areas of 3
- The function of these sweeping areas 66 and 67 is to pick up noise that may affect the pivot pixel 65.
- The sweeping areas 66 and 67 are used to find the contrast between the pixels of the sweeping area and the central pixel of the sweeping area. If there is a contrast between them, the central pixel of the sweeping area takes the average of the pixels of the sweeping area being analyzed.
- The left sweeping area 66 is used when the left page of the document is analyzed, and the right sweeping area 67 is used when the right page is analyzed.
- When the pivot pixel 65 is being analyzed, it is determined whether the pivot pixel 65 is classified as being either important, irrelevant, or as borrowing color from a neighbor. This is done by analyzing the contrast between the pixels of the pivot area 63 and the pivot pixel 65. If a pixel is part of a letter, i.e., an important pixel, focus distortion can cause the surrounding pixels to take its color. If there is an important contrast between the pivot pixel and its neighbors, the pivot pixel 65 is designated as an important pixel.
- Each of the pixels in the pivot area 63, excluding the pivot pixel, is added. This sum is then subtracted from the pivot pixel multiplied by 24 (5 x 5 - 1) to thereby provide a contrast value.
- Each of the pixels in the pivot area 63 should be smaller than the average of the pixels in the area plus the contrast value; and (iii) the contrast value should be greater than 130.
- If the pixel is not important, it is either an irrelevant pixel or a pixel that borrows color from a neighbor. If the pivot is less than 220 and the contrast value is less than 130, the pivot pixel 65 is a pixel that borrows color from its neighbors.
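- A sketch of this classification is given below, using the 5 x 5 pivot area, the contrast value computed from the 24 neighbors, and the 130 and 220 thresholds quoted above; the grayscale reading of the pixels and the exact handling of the condition on individual neighbors are assumptions rather than the patent's formulation.
```python
import numpy as np

def classify_pivot_pixel(gray, y, x):
    """Classify the pivot pixel 65 at (y, x) of a grayscale image as
    'important', 'borrows', or 'irrelevant'. Assumes (y, x) lies at least
    two pixels away from the image border (full 5 x 5 pivot area 63)."""
    area = gray[y - 2:y + 3, x - 2:x + 3].astype(np.int64)   # 5 x 5 pivot area 63
    pivot = int(area[2, 2])
    neighbors_sum = int(area.sum()) - pivot
    contrast = pivot * 24 - neighbors_sum                    # pivot * (5*5 - 1) minus the neighbors
    neighbors_ok = (area < area.mean() + contrast).all()     # each pixel smaller than average + contrast
    if contrast > 130 and neighbors_ok:
        return "important"                                   # e.g. part of a letter
    if pivot < 220 and contrast < 130:
        return "borrows"                                     # borrows color from its neighbors
    return "irrelevant"                                      # left unmodified
```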
- Color refers to each one of the red, green, or blue color values.
- Irrelevant pixels are not modified.
- Color is removed from pixels that are classified as borrowing color from their neighbor. Equation 12, set forth below, describes how to calculate a new color value of a pixel that borrows color from its neighbor.
- In Equation 12, Color refers to each one of the red, green, and blue colors.
- The new values are not immediately set but are stored until the pivot plane 64 has moved past the respective pixel. It is noted that after two lines of the captured image have been fixed, such as is shown in Figure 24, the correction of these lines could affect the results of correcting the focus distortion of an uncorrected pixel. However, when out-of-focus distortion appears, pixels typically borrow information from neighbors, and the correction of such pixels in the pixel window does not affect the cleanup of the focus distortion. Furthermore, as noted above, the original values of those pixels positioned to the left or right of an unfocused pixel are used during the process of removing focus distortion, instead of their corrected values.
- The new, less distorted image data is then either saved on the computer or another device, or it can be coupled to an output apparatus that will print a less distorted image than that which was originally presented.
- The image is captured using an optical sensor in a photocopying machine, a scanner, a digital copier, a camera, a digital camera, a line studio camera, or a mopier; other apparatuses may also be used to implement the process.
- The process may be used with a particular machine as such, or it may be used as an intermediary step.
- The foregoing system and method may also be integrated with an OCR processing device.
- The foregoing system and method could also be implemented as software routines that are written in a programming language such as C, C++, Fortran, or Pascal.
- The process may also be used to apply the correction factors to an image that has been previously acquired either by a photocopy or another device. Processing can also be adjusted to account for the textual content of the document image being vertically oriented. The process may be used both for color and for black and white text. The process may also be implemented by fragmenting the document image into blocks and processing the blocks sequentially or in a parallel manner.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Processing (AREA)
- Facsimile Scanning Arrangements (AREA)
- Facsimile Image Signal Circuits (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU69019/00A AU6901900A (en) | 1999-08-11 | 2000-08-11 | System and method for processing optically scanned documents |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14841999P | 1999-08-11 | 1999-08-11 | |
US60/148,419 | 1999-08-11 | ||
US18201400P | 2000-02-11 | 2000-02-11 | |
US60/182,014 | 2000-02-11 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2001011864A2 true WO2001011864A2 (en) | 2001-02-15 |
WO2001011864A3 WO2001011864A3 (en) | 2001-07-05 |
Family
ID=26845839
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2000/022036 WO2001011864A2 (en) | 1999-08-11 | 2000-08-11 | System and method for processing optically scanned documents |
Country Status (2)
Country | Link |
---|---|
AU (1) | AU6901900A (en) |
WO (1) | WO2001011864A2 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1879374A1 (en) * | 2006-07-14 | 2008-01-16 | Samsung Electronics Co., Ltd. | Image processing apparatus and method |
US8306335B2 (en) | 2011-03-30 | 2012-11-06 | Seiko Epson Corporation | Method of analyzing digital document images |
US8413037B2 (en) | 2010-06-27 | 2013-04-02 | Hewlett-Packard Development Company, L.P. | User selection of flaw present within digitally scanned document |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0937031A (en) * | 1995-07-21 | 1997-02-07 | Minolta Co Ltd | Original reader |
JPH118740A (en) * | 1997-04-25 | 1999-01-12 | Ricoh Co Ltd | Picture reader |
JPH11187171A (en) * | 1997-12-17 | 1999-07-09 | Minolta Co Ltd | Image reader |
-
2000
- 2000-08-11 AU AU69019/00A patent/AU6901900A/en not_active Abandoned
- 2000-08-11 WO PCT/US2000/022036 patent/WO2001011864A2/en active Application Filing
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1879374A1 (en) * | 2006-07-14 | 2008-01-16 | Samsung Electronics Co., Ltd. | Image processing apparatus and method |
US7848590B2 (en) | 2006-07-14 | 2010-12-07 | Samsung Electronics Co., Ltd. | Image processing apparatus and method of removing regions or dividing portions of an input image to reduce computation |
US8238659B2 (en) | 2006-07-14 | 2012-08-07 | Samsung Electronics Co., Ltd. | Image processing apparatus and method of determining a region of an input image and adjusting pixel values of the region |
US8413037B2 (en) | 2010-06-27 | 2013-04-02 | Hewlett-Packard Development Company, L.P. | User selection of flaw present within digitally scanned document |
US8306335B2 (en) | 2011-03-30 | 2012-11-06 | Seiko Epson Corporation | Method of analyzing digital document images |
Also Published As
Publication number | Publication date |
---|---|
AU6901900A (en) | 2001-03-05 |
WO2001011864A3 (en) | 2001-07-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP3376077B2 (en) | Distortion correction device, distortion correction method, and image processing device | |
US7068852B2 (en) | Edge detection and sharpening process for an image | |
EP1221810B1 (en) | Adaptive illumination correction of scanned images | |
US5969829A (en) | Image reader that stores sets of luminance correction data corresponding to document a surface height | |
JPH09238248A (en) | Method for correcting read image and image reader | |
JP3695163B2 (en) | Image forming apparatus | |
US5987163A (en) | Apparatus and method for determining the size of a page size in an image reader | |
US5847884A (en) | Image reading apparatus | |
US5834762A (en) | Image reading apparatus and method | |
WO2001011864A2 (en) | System and method for processing optically scanned documents | |
JPH08237485A (en) | Image reader | |
US20050019072A1 (en) | Scanning system for copying documents | |
JP3852247B2 (en) | Image forming apparatus and transfer image distortion correction method | |
JP3184602B2 (en) | Image reading device | |
JP3135240B2 (en) | Image processing device | |
JPH09130516A (en) | Image processing unit recognizing top and bottom of original image | |
JP3430776B2 (en) | Image reading device | |
JPH04238457A (en) | Electrophotographic device | |
JP3675181B2 (en) | Image recognition device | |
KR100895622B1 (en) | Image forming apparatus and image forming method | |
JPH1093776A (en) | Image reader | |
JPH09266528A (en) | Image reader | |
JPH1093778A (en) | Image reader | |
JPH10136194A (en) | Image reader | |
JP2003125172A (en) | Control method of image reading device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): AE AG AL AM AT AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ CZ DE DE DK DK DM DZ EE EE ES FI FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
AK | Designated states |
Kind code of ref document: A3 Designated state(s): AE AG AL AM AT AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ CZ DE DE DK DK DM DZ EE EE ES FI FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A3 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
122 | Ep: pct application non-entry in european phase | ||
NENP | Non-entry into the national phase |
Ref country code: JP |