US20030025926A1 - Image processing apparatus and image processing method - Google Patents
Image processing apparatus and image processing method
- Publication number
- US20030025926A1
- Application number
- US09/921,703
- Authority
- US
- United States
- Prior art keywords
- image
- data
- discrimination
- discrimination data
- image data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K15/00—Arrangements for producing a permanent visual presentation of the output data, e.g. computer output printers
- G06K15/02—Arrangements for producing a permanent visual presentation of the output data, e.g. computer output printers using printers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K15/00—Arrangements for producing a permanent visual presentation of the output data, e.g. computer output printers
- G06K15/02—Arrangements for producing a permanent visual presentation of the output data, e.g. computer output printers using printers
- G06K15/18—Conditioning data for presenting it to the physical printing elements
- G06K15/1848—Generation of the printable image
- G06K15/1849—Generation of the printable image using an intermediate representation, e.g. a list of graphical primitives
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K15/00—Arrangements for producing a permanent visual presentation of the output data, e.g. computer output printers
- G06K15/02—Arrangements for producing a permanent visual presentation of the output data, e.g. computer output printers using printers
- G06K15/18—Conditioning data for presenting it to the physical printing elements
- G06K15/1848—Generation of the printable image
- G06K15/1852—Generation of the printable image involving combining data of different types
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/58—Edge or detail enhancement; Noise or error suppression, e.g. colour misregistration correction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/60—Colour correction or control
- H04N1/6072—Colour correction or control adapting to different types of images, e.g. characters, graphs, black and white image portions
Definitions
- the present invention relates to an image processing apparatus and an image processing method, and more particularly to an image processing apparatus and an image processing method for outputting, with high quality, page description information that has been formed by an apparatus such as a personal computer.
- page description information such as DTP data
- an image output apparatus such as a printer
- the data to be output is sent to an image output apparatus such as a printer or an MFP via a printer controller that receives the page description information and develops it to image data comprising pixel arrays of four colors, Cyan, Magenta, Yellow and Black, which represent ink amounts.
- the printer controller not only performs development to image data but also produces discrimination data representative of attributes of respective pixels of the image data.
- Jpn. Pat. Appln. KOKAI Publication No. 9-282472 discloses a technique wherein characters or given discrimination signals representing other attributes, as well as image data, are produced and transmitted, and the image data is subjected to an image process corresponding to the discrimination signals in an image output apparatus.
- image data includes character information
- an image process, for example one for preventing degradation in the quality of characters, is performed and the data is output from the image output apparatus.
- Jpn. Pat. Appln. KOKAI Publication No. 2000-270213 discloses a technique wherein generated discrimination data is converted to data representing correspondency with image data, thereby reducing the memory capacity needed for storing the discrimination data.
- image development means, i.e. a printer controller
- printer controller simultaneously produces image data and discrimination data on the basis of page description information, and the image data is output from an image forming apparatus capable of switching image processes according to the discrimination data.
- an ordinary printer controller is unable to generate desired discrimination data, and thus the printer controller is limited to a specific type.
- image data matching the characteristics of the output apparatus is not necessarily produced.
- in ordinary cases, image data is produced such that a black character portion is written in black ink alone, with no information on the color of the background.
- if this image data is output as such from a printer and an error occurs in the print position between the black ink and the color ink, a colorless portion forms around the character and the image quality deteriorates.
- the object of the present invention is to provide an image processing apparatus and an image processing method capable of performing a high-image-quality image process matched to the output characteristics of a printer, even in a case where an ordinary printer controller is used.
- the present invention provides an image processing apparatus comprising: image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data and the first discrimination data generated by the image development means; image data generating means for generating second image data by correcting the first image data generated by the image development means on the basis of the second discrimination data generated by the discrimination data generating means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
- the invention provides an image processing apparatus comprising: image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data generated by the image development means; image data generating means for generating second image data by correcting the first image data generated by the image development means on the basis of the second discrimination data generated by the discrimination data generating means and the first discrimination data generated by the image development means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the first discrimination data generated by the image development means and the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
- the invention provides an image processing apparatus comprising: image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data generated by the image development means or using the first image data and the first discrimination data; image processing means for subjecting the first image data generated by the image development means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means and the first discrimination data generated by the image development means; and image output means for outputting image data processed by the image processing means.
- the invention provides an image processing apparatus comprising: input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data; setting means for desiredly setting the type of attributes represented by the first discrimination data input by the input means; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the type of attributes set by the setting means and the first image data and the first discrimination data input by the input means; image data generating means for generating second image data by correcting the first image data input by the input means on the basis of the second discrimination data generated by the discrimination data generating means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
- the invention provides an image processing apparatus comprising: input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data input by the input means; setting means for desiredly setting the type of attributes represented by the first discrimination data input by the input means; image data generating means for generating second image data by correcting the first image data input by the input means on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
- the invention provides an image processing apparatus comprising: input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data input by the input means; setting means for desiredly setting the type of attributes represented by the first discrimination data input by the input means; image processing means for subjecting the first image data input by the input means to a predetermined process on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
- the invention provides an image processing method for image-processing information described in a page description language, and outputting an image, comprising: generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of the information described in the page description language; generating second discrimination data different from the first discrimination data, using the generated first image data and first discrimination data; generating second image data by correcting the generated first image data on the basis of the generated second discrimination data; subjecting the generated second image data to a predetermined process on the basis of the generated second discrimination data; and outputting processed image data.
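The claimed method above can be sketched as a four-stage pipeline. The following is a minimal illustrative sketch, not the patent's implementation: every function body is a trivial stand-in, and all names, thresholds, and the toy page data are assumptions.

```python
# Illustrative four-stage pipeline for the claimed image processing method.
# Each stage is a deliberately trivial stand-in for the means described
# in the embodiments that follow.

def develop(page_description):
    """Stage 1: develop PDL into first image data + first discrimination data.
    Stand-in: every object becomes one pixel tagged with its attribute."""
    image = [obj["density"] for obj in page_description]
    discrimination = [obj["attribute"] for obj in page_description]
    return image, discrimination

def generate_second_discrimination(image, discrimination):
    """Stage 2: derive second discrimination data from both inputs
    (assumed rule: dense TEXT pixels become NEW-TEXT)."""
    return ["NEW-TEXT" if a == "TEXT" and d > 128 else "NEW-GRAPHIC"
            for d, a in zip(image, discrimination)]

def correct_image(image, second):
    """Stage 3: correct first image data into second image data.
    Stand-in: pass-through (the over-print process is shown later)."""
    return [d for d in image]

def image_process(image, second):
    """Stage 4: per-pixel process switched by the second discrimination
    data (assumed rule: emphasize NEW-TEXT pixels)."""
    return [min(255, d + 32) if s == "NEW-TEXT" else d
            for d, s in zip(image, second)]

page = [{"density": 200, "attribute": "TEXT"},
        {"density": 90,  "attribute": "IMAGE"}]
img, disc = develop(page)
second = generate_second_discrimination(img, disc)
out = image_process(correct_image(img, second), second)
```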
- FIG. 1 is a block diagram showing the structure of an image processing apparatus according to a first embodiment of the present invention
- FIG. 2 shows an example of the structure of image development means
- FIG. 3 shows an example of the structure of discrimination data generating means
- FIG. 4 shows an example of the structure of an edge detection section in the discrimination data generating means
- FIG. 5 shows an example of the structure of a color detection section in the discrimination data generating means
- FIG. 6 shows an example of the structure of a synthetic determination section in the discrimination data generating means
- FIG. 7 shows an example of conversion by a converter
- FIG. 8 shows an example of the structure of image data generating means
- FIG. 9A shows an example of first image data
- FIG. 9B shows an example of second image data in a case where an output value of the first image data has been replaced
- FIG. 10 is a view for describing a smoothing process
- FIG. 11 shows an example of the structure of image processing means
- FIG. 12 shows an example of a correction table
- FIG. 13 shows an example of the correction table
- FIG. 14 shows an example of the correction table
- FIG. 15 shows an example of the correction table
- FIG. 16 shows an example of the correction table
- FIG. 17 shows an example of the correction table
- FIG. 18 shows an example of the correction table
- FIG. 19 shows an example of the correction table
- FIG. 20 is a block diagram showing the structure of an image processing apparatus according to a second embodiment
- FIG. 21 is a block diagram showing the structure of an image processing apparatus according to a third embodiment
- FIG. 22 is a block diagram showing the structure of an image processing apparatus according to a fourth embodiment
- FIG. 23 is a block diagram showing the structure of an image processing apparatus according to a fifth embodiment.
- FIG. 24 is a block diagram showing the structure of an image processing apparatus according to a sixth embodiment.
- FIG. 1 is a block diagram showing the structure of an image processing apparatus 1 according to a first embodiment of the present invention.
- This image processing apparatus 1 is usually called a printer.
- the apparatus receives document data or the like produced by a personal computer via a network, generates image data comprising toner amount information, and transfers toner onto paper, thus performing image formation.
- the image processing apparatus 1 comprises image development means (controller unit) 11 , discrimination data generating means 12 , image data generating means 13 , image processing means 14 , and image output means (printer) 15 .
- the image development means 11 receives DTP (Desk Top Publishing) data formed on a personal computer or document data of a word processor, etc. as page information described in a page description language (PDL).
- the image development means 11 develops the received data to first image data as bit map data and to first discrimination data representative of attributes of each pixel.
- the page information contains characters as font data, figures as line description data or painted-out region data, and others as ordinary raster image data.
- when the page information is output as a print image, it is necessary to develop all the data as the same bit map data.
- the image processing apparatus may be constructed such that the image development means 11 is provided externally as a printer controller.
- the discrimination data generating means 12 generates second discrimination data for each pixel, which is necessary for controlling the image processing means 14 , on the basis of the first image data and the first discrimination data.
- the second discrimination data differs from the first discrimination data and corresponds to an image area discrimination signal that is commonly used in a copying machine, etc.
- in a copying machine, such discrimination data can be generated from the scanner input image.
- the image data generating means 13 corrects the first image data on the basis of the second discrimination data generated by the discrimination data generating means 12 , and thus generates second image data.
- the correction of the image data in this context is effected by, for example, an over-print process (performed because a white blank portion would otherwise form between a black line and a C, M or Y color-component background due to a print position error at the time of printing out), a trapping process, or a character smoothing process.
- the image processing means 14 performs a process for emphasizing an image (in particular, a character) at the time of printing out.
- General methods of the process are filtering, gamma correction, etc.
- a filter coefficient or a gamma correction table is switched in accordance with the second discrimination data.
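The per-pixel switching described above can be sketched as follows. The table contents and the gamma exponent are illustrative assumptions; the text only states that the filter coefficient or gamma correction table is switched according to the second discrimination data.

```python
# Sketch of gamma-table switching in the image processing means.
# Table values are assumed: a density-raising curve for character
# pixels and a linear (identity) table for other areas.

GAMMA_TABLES = {
    # assumed curve that raises midtone densities to emphasize characters
    "NEW-TEXT":    [min(255, int((v / 255) ** 0.5 * 255)) for v in range(256)],
    # assumed near-linear table for continuous-tone areas
    "NEW-GRAPHIC": list(range(256)),
}

def gamma_correct(density, discrimination):
    """Look up the output density in the table selected by the pixel's
    second discrimination data; unknown classes fall back to linear."""
    table = GAMMA_TABLES.get(discrimination, GAMMA_TABLES["NEW-GRAPHIC"])
    return table[density]
```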
- the image output means 15 uses the output image data (corresponding to the ink amount of each color in the case of a printer) generated by the image processing means 14 , and transfers ink onto a printing medium (paper, etc.).
- FIG. 2 shows an example of the structure of the image development means 11 .
- the image development means 11 comprises a CPU 21 , a RAM 22 and a page memory 23 .
- the page information received by the image development means 11 is converted to first image data and first discrimination data by the CPU 21 , developed in the page memory 23 , and transmitted pixel by pixel.
- FIG. 3 shows an example of the structure of the discrimination data generating means 12 .
- the discrimination data generating means 12 comprises line buffers 31 a and 31 b , an edge detection section 32 , a color detection section 33 and a synthetic determination section 34 .
- the first image data transmitted from the image development means 11 is input to the line buffer 31 a of the discrimination data generating means 12 .
- several lines of the first image data are accumulated in the line buffer 31 a , thereby forming block data.
- the first image data output from the line buffer 31 a is sent to the edge detection section 32 , and it is determined for each color component whether a center pixel (“pixel of interest”) of the block corresponds to an edge portion.
- the first image data output from the line buffer 31 a is sent to the color detection section 33 , and it is determined based on the chroma whether the pixel of interest has an achromatic color or a chromatic color.
- the first discrimination data transmitted from the image development means 11 is input to the line buffer 31 b of the discrimination data generating means 12 .
- the line buffer 31 b is used for establishing synchronism with the first image data.
- the synthetic determination section 34 outputs second discrimination data by performing synthetic determination on the basis of the edge detection result from the edge detection section 32 , the determination result from the color detection section 33 , and the first discrimination data synchronized by the line buffer 31 b.
- FIG. 4 shows an example of the structure of the edge detection section 32 in the discrimination data generating means 12 .
- the edge detection section 32 comprises multipliers 41 a and 41 b , adders 42 a and 42 b , positive number generators 43 a and 43 b , an adder 44 and a comparator 45 .
- the edge detection section 32 is provided for each of the color component image signals C, M, Y and K of the first image data input from the line buffer 31 a , and the edge detection is performed in parallel.
- the multiplier 41 a multiplies a 3 × 3 matrix of the first image data with coefficients (edge detection operators) shown in FIG. 4 by symbol A.
- the adder 42 a adds calculated values of the multiplier 41 a .
- the positive number generator 43 a produces an absolute value of the value calculated by the adder 42 a.
- the multiplier 41 b multiplies a 3 × 3 matrix of the first image data with coefficients (edge detection operators) shown in FIG. 4 by symbol B.
- the adder 42 b adds calculated values of the multiplier 41 b .
- the positive number generator 43 b produces an absolute value of the value calculated by the adder 42 b.
- the adder 44 adds the two absolute values obtained by the positive number generators 43 a and 43 b .
- the comparator 45 compares the added value with a predetermined value, thereby determining the presence/absence of the edge.
- the comparison result of the comparator 45 is output to the synthetic determination section 34 as an edge determination result EC, EM, EY, EK, in association with a color component image signal C, M, Y, K in the first image data input from the line buffer 31 a.
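The edge detection path described above (multipliers 41 a / 41 b, adders, positive number generators, adder 44, comparator 45) can be sketched as below. The text does not reproduce the FIG. 4 coefficients, so the Prewitt-style operators and the threshold value here are assumptions.

```python
# Sketch of the edge detection section (FIG. 4) for one colour component.
# The operators for symbols A and B, and the threshold, are assumed.

OP_A = [[-1, 0, 1],
        [-1, 0, 1],
        [-1, 0, 1]]    # assumed horizontal-gradient operator (symbol A)
OP_B = [[-1, -1, -1],
        [ 0,  0,  0],
        [ 1,  1,  1]]  # assumed vertical-gradient operator (symbol B)

def detect_edge(block, threshold=128):
    """block: 3x3 list of pixel densities for one colour component.
    Returns True if the centre pixel is judged to lie on an edge."""
    # multipliers 41a/41b and adders 42a/42b: weighted sums over the block
    grad_a = sum(block[i][j] * OP_A[i][j] for i in range(3) for j in range(3))
    grad_b = sum(block[i][j] * OP_B[i][j] for i in range(3) for j in range(3))
    # positive number generators 43a/43b take absolute values,
    # adder 44 sums them, comparator 45 applies the threshold
    return abs(grad_a) + abs(grad_b) > threshold
```

In the apparatus this runs in parallel for each of C, M, Y and K, yielding the results EC, EM, EY and EK.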
- FIG. 5 shows an example of the structure of the color detection section 33 in the discrimination data generating means 12 .
- the color detection section 33 comprises subtracters 51 a , 51 b and 51 c , positive number generators 52 a , 52 b and 52 c , a maximum value selector 53 , a comparator 54 , digitizers 55 a , 55 b , 55 c and 55 d , selectors 56 a , 56 b , 56 c and 56 d , AND gates 57 a , 57 b and 57 c , and a NOT gate 58 .
- the subtracter 51 a calculates a difference in density between color components (C, Y) of image signals of the first image data input from the line buffer 31 a , and outputs the difference to the positive number generator 52 a .
- the positive number generator 52 a produces an absolute value of the input density difference between the color components (C, Y), and outputs the absolute value to the maximum value selector 53 .
- the subtracter 51 b calculates a difference in density between color components (C, M) of image signals of the first image data input from the line buffer 31 a , and outputs the difference to the positive number generator 52 b .
- the positive number generator 52 b produces an absolute value of the input density difference between the color components (C, M), and outputs the absolute value to the maximum value selector 53 .
- the subtracter 51 c calculates a difference in density between color components (M, Y) of image signals of the first image data input from the line buffer 31 a , and outputs the difference to the positive number generator 52 c .
- the positive number generator 52 c produces an absolute value of the input density difference between the color components (M, Y), and outputs the absolute value to the maximum value selector 53 .
- the maximum value selector 53 selects a maximum of the values input from the positive number generators 52 a , 52 b and 52 c , and outputs the maximum value to the comparator 54 .
- the comparator 54 compares the input maximum value and a predetermined value, and determines whether the color is achromatic or chromatic.
- the digitizer 55 a digitizes the density of the color component image signal C of the first image data input from the line buffer 31 a .
- the digitizer 55 b digitizes the density of the color component image signal M of the first image data input from the line buffer 31 a .
- the digitizer 55 c digitizes the density of the color component image signal Y of the first image data input from the line buffer 31 a .
- the digitizer 55 d digitizes the density of the color component image signal K of the first image data input from the line buffer 31 a.
- the digitized result represents which color component is effective in the synthetic determination section.
- when the digitized result of the image signal K, i.e. the output of the digitizer 55 d , is “1”, a black over-print process for incorporating a background density into the image data of the color components C, M and Y is performed.
- An AND value between the digitized result of the image signal C, M, Y and an inverted value of the digitized result of the image signal K is obtained.
- the digitized result of the digitizer 55 a and an inverted value of the digitized result of the digitizer 55 d are input to the AND gate 57 a to produce an AND value.
- the digitized result of the digitizer 55 b and an inverted value of the digitized result of the digitizer 55 d are input to the AND gate 57 b to produce an AND value.
- the digitized result of the digitizer 55 c and an inverted value of the digitized result of the digitizer 55 d are input to the AND gate 57 c to produce an AND value.
- the selector 56 a receives the comparison result of the comparator 54 and the AND value of the AND gate 57 a , selects one of them, and outputs a select result SC.
- the selector 56 b receives the comparison result of the comparator 54 and the AND value of the AND gate 57 b , selects one of them, and outputs a select result SM.
- the selector 56 c receives the comparison result of the comparator 54 and the AND value of the AND gate 57 c , selects one of them, and outputs a select result SY.
- the selector 56 d receives the comparison result of the comparator 54 , which has been inverted by the NOT gate 58 , and the digitized result of the digitizer 55 d , selects one of them, and outputs a select result SK.
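The chroma decision and the K-gated over-print flags described above can be sketched as follows. The thresholds are assumptions (the text says only "a predetermined value"), and the selectors 56 a to 56 d are omitted for brevity.

```python
# Sketch of the colour detection section (FIG. 5) for one pixel.
# Threshold values are assumed.

def classify_pixel(c, m, y, k, chroma_threshold=32, density_min=16):
    """Return (is_chromatic, overprint_c, overprint_m, overprint_y).
    A pixel is achromatic when its C, M, Y densities nearly coincide."""
    # subtracters 51a-51c and positive number generators 52a-52c compute
    # the pairwise density differences; selector 53 takes the maximum
    max_diff = max(abs(c - y), abs(c - m), abs(m - y))
    is_chromatic = max_diff > chroma_threshold      # comparator 54
    # digitizers 55a-55d binarise the density of each component
    dc, dm, dy, dk = (v > density_min for v in (c, m, y, k))
    # AND gates 57a-57c: a colour component is flagged for the black
    # over-print process only where K itself is not printed
    return is_chromatic, dc and not dk, dm and not dk, dy and not dk
```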
- FIG. 6 shows an example of the structure of the synthetic determination section 34 in the discrimination data generating means 12 .
- the synthetic determination section 34 comprises converters 61 a , 61 b , 61 c and 61 d , and AND gates 62 a , 62 b , 62 c and 62 d.
- Signals EC, EM, EY and EK input from the edge detection sections 32 associated with the image signals C, M, Y and K represent the edge detection results of C, M, Y and K.
- Signals SC, SM, SY and SK input from the color detection section 33 represent the color detection results of C, M, Y and K.
- the converter 61 a receives the edge detection result EC from the edge detector 32 and the first discrimination data from the line buffer 31 b , and outputs desired converted discrimination data based on them.
- the converter 61 b receives the edge detection result EM from the edge detector 32 and the first discrimination data from the line buffer 31 b , and outputs desired converted discrimination data based on them.
- the converter 61 c receives the edge detection result EY from the edge detector 32 and the first discrimination data from the line buffer 31 b , and outputs desired converted discrimination data based on them.
- the converter 61 d receives the edge detection result EK from the edge detector 32 and the first discrimination data from the line buffer 31 b , and outputs desired converted discrimination data based on them.
- FIG. 7 shows an example of conversion by the converters 61 a , 61 b , 61 c and 61 d .
- the first discrimination data is classified such that a character described as font data with a predetermined size or less is “TEXT”, an object described as line description data or painted-out data and a character other than “TEXT” are “GRAPHIC”, and an object other than “TEXT” and “GRAPHIC” is “IMAGE”.
- the second discrimination data is output as “NEW-TEXT” (conversion result).
- when the first discrimination data is “IMAGE” and the edge detection result is “NON-EDGE”, the second discrimination data is output as “NEW-GRAPHIC” (conversion result).
- the desired discrimination data (second discrimination data) output from the converter 61 a is input to the AND gate 62 a .
- the AND gate 62 a produces second discrimination data DC as an AND value between the desired discrimination data input from the converter 61 a and the color detection result SC input from the color detection section 33 .
- the desired discrimination data (second discrimination data) output from the converter 61 b is input to the AND gate 62 b .
- the AND gate 62 b produces second discrimination data DM as an AND value between the desired discrimination data input from the converter 61 b and the color detection result SM input from the color detection section 33 .
- the desired discrimination data (second discrimination data) output from the converter 61 c is input to the AND gate 62 c .
- the AND gate 62 c produces second discrimination data DY as an AND value between the desired discrimination data input from the converter 61 c and the color detection result SY input from the color detection section 33 .
- the desired discrimination data (second discrimination data) output from the converter 61 d is input to the AND gate 62 d .
- the AND gate 62 d produces second discrimination data DK as an AND value between the desired discrimination data input from the converter 61 d and the color detection result SK input from the color detection section 33 .
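The four AND gates can be modeled as follows, assuming 1-bit signals; the example values for the converter outputs and the color detection results SC, SM, SY, SK are hypothetical, since the actual signal widths are not given in the text.

```python
def and_gate(converted: int, color_detected: int) -> int:
    """Model one AND gate: the second discrimination data for a color
    component is asserted only when both the converted discrimination
    data and that component's color detection result are asserted."""
    return converted & color_detected

# Hypothetical example values for one pixel:
SC, SM, SY, SK = 1, 0, 0, 1                   # color detection results
converted = {"C": 1, "M": 1, "Y": 0, "K": 1}  # outputs of converters 61a-61d
DC = and_gate(converted["C"], SC)  # second discrimination data, component C
DM = and_gate(converted["M"], SM)
DY = and_gate(converted["Y"], SY)
DK = and_gate(converted["K"], SK)
```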
- FIG. 8 shows an example of the structure of the image data generating means 13 .
- the image data generating means 13 comprises line buffers 71 a and 71 b , a background density averaging section 72 , a character density averaging section 73 , and a selector 74 .
- the line buffer 71 a accumulates n-lines of the first image data output from the image development means 11 .
- the line buffer 71 b accumulates n-lines of the second discrimination data output from the discrimination data generating means 12 .
- the background density averaging section 72 calculates, for each of the color components C, M and Y, the average density of those pixels within the n×n area around the pixel of interest whose second discrimination data DK on the color component K is not “NEW-TEXT”.
- the character density averaging section 73 calculates the average density of each of the color components C, M, Y and K within an area of m×m pixels (m&lt;n) around the pixel of interest.
- the selector 74 outputs second image data by properly replacing the pixel values in accordance with the second discrimination data on the pixel of interest.
- the data C, M, Y of the pixel of interest shown in FIG. 9A is changed to the output value of the background density averaging section 72 as shown in FIG. 9B (over-print process or trapping process).
- the first image data C, M, Y, K shown in FIG. 9A is replaced with the output value of the second image data C, M, Y, K shown in FIG. 9B.
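The background density averaging and the selector's replacement step can be sketched as follows. The window size n = 5, the per-pixel (C, M, Y, K) tuple layout and the function names are assumptions; what follows the text is the exclusion of “NEW-TEXT” pixels from the background average and the replacement of the C, M, Y values of a “NEW-TEXT” pixel of interest.

```python
def background_average(image, disc_k, x, y, n=5):
    """Average C, M, Y over the n-by-n window around (x, y), using only
    pixels whose K-component second discrimination data is not
    "NEW-TEXT" (i.e. background pixels)."""
    h = n // 2
    sums, count = [0, 0, 0], 0
    for j in range(y - h, y + h + 1):
        for i in range(x - h, x + h + 1):
            if 0 <= j < len(image) and 0 <= i < len(image[0]):
                if disc_k[j][i] != "NEW-TEXT":
                    c, m, yy, _k = image[j][i]
                    sums[0] += c; sums[1] += m; sums[2] += yy
                    count += 1
    return tuple(s // count for s in sums) if count else (0, 0, 0)

def overprint(image, disc_k, x, y):
    """If the pixel of interest is a black character ("NEW-TEXT"),
    replace its C, M, Y with the background average, leaving K intact
    (over-print / trapping)."""
    c, m, yy, k = image[y][x]
    if disc_k[y][x] == "NEW-TEXT":
        c, m, yy = background_average(image, disc_k, x, y)
    return (c, m, yy, k)
```

Replacing only C, M and Y while keeping K means the black character is printed over the averaged background, so a registration error exposes background color rather than white paper.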
- FIG. 11 shows an example of the structure of the image processing means 14 .
- the image processing means 14 comprises line buffers 101 a and 101 b , a filter section 102 , a gamma correction section 103 , and a screen processing section 104 .
- the line buffer 101 a accumulates several lines of the second image data generated by the image data generating means 13 for the purpose of filter processing.
- the line buffer 101 b outputs the second discrimination data on the pixel of interest (center pixel of an image matrix) in synchronism with the second image data.
- the filter section 102 multiplies each pixel of the image matrix buffered by the line buffer 101 a with a predetermined coefficient, and calculates the sum of the products.
- the filter section 102 changes the coefficient for multiplication in accordance with the second discrimination data output synchronously from the line buffer 101 b.
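The coefficient switching can be sketched as follows; the two kernels are hypothetical (a sharpening kernel for “NEW-TEXT” pixels and a mild smoothing kernel otherwise), since the actual coefficients are not given in the text.

```python
# Hypothetical 3x3 kernels; illustrative values only.
SHARPEN = [[0, -1, 0], [-1, 5, -1], [0, -1, 0]]
SMOOTH = [[1, 1, 1], [1, 8, 1], [1, 1, 1]]  # normalized by 16

def filter_pixel(window, discrimination):
    """Weighted sum of a 3x3 image window; the kernel is switched by the
    second discrimination data of the center pixel."""
    if discrimination == "NEW-TEXT":
        kernel, div = SHARPEN, 1
    else:
        kernel, div = SMOOTH, 16
    acc = sum(window[j][i] * kernel[j][i] for j in range(3) for i in range(3))
    return max(0, min(255, acc // div))  # clamp to the 8-bit range
```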
- the gamma correction section 103 corrects each pixel of the second image data for each color component, using correction tables as shown in FIGS. 12 to 19 .
- the gamma correction section 103 switches the correction table in accordance with the second discrimination data output synchronously from the line buffer 101 b.
- a correction table shown in FIG. 12 relates to correction of color component C in a case where the second discrimination data is “NEW-TEXT”.
- a correction table shown in FIG. 13 relates to correction of color component C in a case where the second discrimination data is not “NEW-TEXT”.
- a correction table shown in FIG. 14 relates to correction of color component M in a case where the second discrimination data is “NEW-TEXT”.
- a correction table shown in FIG. 15 relates to correction of color component M in a case where the second discrimination data is not “NEW-TEXT”.
- a correction table shown in FIG. 16 relates to correction of color component Y in a case where the second discrimination data is “NEW-TEXT”.
- a correction table shown in FIG. 17 relates to correction of color component Y in a case where the second discrimination data is not “NEW-TEXT”.
- a correction table shown in FIG. 18 relates to correction of color component K in a case where the second discrimination data is “NEW-TEXT”.
- a correction table shown in FIG. 19 relates to correction of color component K in a case where the second discrimination data is not “NEW-TEXT”.
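The table switching of FIGS. 12 to 19 can be modeled as one 256-entry lookup table per color component and discrimination state; the gamma exponents below are illustrative stand-ins for the actual curves, which appear only in the drawings.

```python
# One hypothetical 256-entry table per color component and state;
# the exponents 0.7 / 1.2 are illustrative, not the patent's curves.
GAMMA = {
    comp: {
        "NEW-TEXT": [round((v / 255) ** 0.7 * 255) for v in range(256)],
        "OTHER": [round((v / 255) ** 1.2 * 255) for v in range(256)],
    }
    for comp in "CMYK"
}

def gamma_correct(value: int, component: str, discrimination: str) -> int:
    """Correct one 8-bit sample of a color component, switching the
    table on whether the pixel's second discrimination data is
    "NEW-TEXT"."""
    state = "NEW-TEXT" if discrimination == "NEW-TEXT" else "OTHER"
    return GAMMA[component][state][value]
```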
- the screen processing section 104 processes each pixel of the corrected second image data input from the gamma correction section 103 in accordance with the second discrimination data input synchronously from the line buffer 101 b , thereby outputting image data of each color component matching with the image output means 15 in the rear stage.
- the processing is, for example, an error diffusion process for converting image data of 8 bits per pixel (256 tone levels) to image data of 1 bit per pixel (2 tone levels).
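A minimal one-dimensional sketch of such an error diffusion step is shown below; a practical implementation would typically distribute the quantization error over a two-dimensional neighborhood (e.g. Floyd-Steinberg weights), which is omitted here for brevity.

```python
def error_diffusion(row):
    """Convert one line of 8-bit (256-level) pixel values to 1-bit
    output: each pixel is thresholded, and its quantization error is
    carried into the next pixel on the line."""
    out, err = [], 0
    for v in row:
        v = v + err
        bit = 1 if v >= 128 else 0
        out.append(bit)
        err = v - 255 * bit  # residual error propagated rightward
    return out
```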
- the image output means 15 transfers the output image data from the screen processing section 104 onto printing medium (paper or the like).
- the first discrimination data is generated by the image development means and the second discrimination data is generated by the discrimination data generating means, for example, in the following manner.
- the image development means generates first discrimination data that discriminates whether each pixel is associated with a character or a line figure, and the discrimination data generating means generates second discrimination data that does not discriminate whether each pixel is associated with a character or a line figure, using the first discrimination data generated by the image development means.
- the character is an object disposed in the first image data as font data.
- the line figure is an object described by a straight line and a curve.
- the image development means generates first discrimination data that does not discriminate whether each pixel is associated with a line figure or a plane figure, and the discrimination data generating means generates second discrimination data that discriminates whether each pixel is associated with a line figure or a plane figure, using the first discrimination data generated by the image development means.
- the plane figure is an object, the entirety or each component of which is painted out with uniform density.
- the image development means generates first discrimination data that does not discriminate whether each pixel is associated with a contour portion or an inside portion of a plane figure, and the discrimination data generating means generates second discrimination data that discriminates whether each pixel is associated with a contour portion or an inside portion of a plane figure, using the first discrimination data generated by the image development means.
- the image development means generates first discrimination data that discriminates whether each pixel is associated with a plane figure or a tone image, and the discrimination data generating means generates second discrimination data that does not discriminate whether each pixel is associated with a plane figure or a tone image, using the first discrimination data generated by the image development means.
- the image development means generates first discrimination data that discriminates that each pixel is associated with a tone image, and the discrimination data generating means generates second discrimination data that discriminates the magnitude of density variation in each pixel, using the first discrimination data generated by the image development means.
- the first embodiment comprises the discrimination data generating means for generating the second discrimination data on the basis of the first image data and the first discrimination data generated from the page information described in the page description language, and the image data generating means for generating the second image data by correcting the first image data on the basis of the second discrimination data, thereby enabling an image quality enhancing process that matches the output characteristics of the printer.
- FIG. 20 shows the structure of an image processing apparatus 2 according to a second embodiment.
- a discrimination data generating means 122 generates second discrimination data without using the first discrimination data generated by image development means 121 . Thereby, the independence of the first discrimination data and the second discrimination data is enhanced, and a greater degree of freedom is provided in the circuit configuration.
- FIG. 21 shows the structure of an image processing apparatus 3 according to a third embodiment.
- the image data generating means 123 of the image processing apparatus 2 shown in FIG. 20 is omitted. Since the image processing apparatus 3 of the third embodiment does not generate the second image data, the line memory, etc. are not needed and the image processing apparatus can be formed at low cost.
- FIG. 22 shows the structure of an image processing apparatus 4 according to a fourth embodiment.
- the controller unit (image development means 11 ) of the image processing apparatus 1 shown in FIG. 1 is omitted and is instead provided as an external element.
- data input means 141 , serving as an interface with the external controller, and discrimination type setting means 146 are provided.
- the data input means 141 of the image processing apparatus 4 is, for example, an interface unit of a LAN (Local Area Network).
- the discrimination type setting means 146 is a means for setting the type of the first discrimination data input from the external controller. Specification information of the external controller is input to the discrimination type setting means 146 , and the discrimination type setting means 146 is preset by the operation of a user, a manager, a designer, etc.
- the discrimination types of first discrimination data described in connection with the first embodiment are “TEXT”, “GRAPHIC” and “IMAGE”, and the correspondence of the three discrimination types as shown in FIG. 7 is registered (set) by the discrimination type setting means 146 .
- an external controller that generates any kind of discrimination data can be connected to the image processing apparatus 4 .
- FIG. 23 shows the structure of an image processing apparatus 5 according to a fifth embodiment.
- the controller unit (image development means 121 ) of the image processing apparatus 2 shown in FIG. 20 is omitted and is instead provided as an external element.
- data input means 151 , serving as an interface with the external controller, and discrimination type setting means 156 are provided.
- the data input means 151 of the image processing apparatus 5 is, for example, an interface unit of a LAN (Local Area Network).
- the discrimination type setting means 156 is a means for setting the type of the first discrimination data input from the external controller. Specification information of the external controller is input to the discrimination type setting means 156 , and the discrimination type setting means 156 is preset by the operation of a user, a manager, a designer, etc.
- the discrimination types of first discrimination data described in connection with the first embodiment are “TEXT”, “GRAPHIC” and “IMAGE”, and the correspondence of the three discrimination types as shown in FIG. 7 is registered (set) by the discrimination type setting means 156 .
- an external controller that generates any kind of discrimination data can be connected to the image processing apparatus 5 .
- FIG. 24 shows the structure of an image processing apparatus 6 according to a sixth embodiment.
- the controller unit (image development means 131 ) of the image processing apparatus 3 shown in FIG. 21 is omitted and is instead provided as an external element.
- data input means 161 , serving as an interface with the external controller, and discrimination type setting means 165 are provided.
- the data input means 161 of the image processing apparatus 6 is, for example, an interface unit of a LAN (Local Area Network).
- the discrimination type setting means 165 is a means for setting the type of the first discrimination data input from the external controller. Specification information of the external controller is input to the discrimination type setting means 165 , and the discrimination type setting means 165 is preset by the operation of a user, a manager, a designer, etc.
- the discrimination types of first discrimination data described in connection with the first embodiment are “TEXT”, “GRAPHIC” and “IMAGE”, and the correspondence of the three discrimination types as shown in FIG. 7 is registered (set) by the discrimination type setting means 165 .
- an external controller that generates any kind of discrimination data can be connected to the image processing apparatus 6 .
Description
- The present invention relates to an image processing apparatus and an image processing method, and more particularly to an image processing apparatus and an image processing method for outputting page description information with high quality, which has been formed by an image forming apparatus such as a personal computer.
- In general, when page description information such as DTP data formed on a personal computer is to be output from an image output apparatus such as a printer or an MFP, the information is sent to the image output apparatus via a printer controller, which receives the page description information and develops it to image data comprising pixel arrays of four colors, Cyan, Magenta, Yellow and Black, representing ink amounts. The printer controller not only performs development to image data but also produces discrimination data representative of attributes of respective pixels of the image data.
- For example, Jpn. Pat. Appln. KOKAI Publication No. 9-282472 discloses a technique wherein characters or given discrimination signals representing other attributes, as well as image data, are produced and transmitted, and the image data is subjected to an image process corresponding to the discrimination signals in an image output apparatus. Thereby, where image data includes character information, an image process, for example, for preventing degradation in quality of characters is performed and the data is output from the image output apparatus.
- On the other hand, Jpn. Pat. Appln. KOKAI Publication No. 2000-270213 discloses a technique wherein generated discrimination data is converted to data representing correspondency with image data, thereby reducing the memory capacity needed for storing the discrimination data.
- In the technique disclosed in the above-mentioned Jpn. Pat. Appln. KOKAI Publication No. 9-282472, however, image development means (i.e. printer controller) simultaneously produces image data and discrimination data on the basis of page description information, and the image data is output from an image forming apparatus capable of switching image processes according to the discrimination data. In this case, an ordinary printer controller is unable to generate desired discrimination data, and thus the printer controller is limited to a specific type.
- Moreover, when an ordinary printer controller is used, image data matching the characteristics of the output apparatus is not necessarily produced. For example, in the case of a color image in which black characters are written on a colored background, the image data is ordinarily produced such that the black character portion is written in black alone, with no information on the color of the background beneath it. If this image data is output as such from a printer and an error occurs in the print position between black ink and color ink, a colorless portion forms around the characters and the image quality deteriorates.
- The object of the present invention is to provide an image processing apparatus and an image processing method capable of performing a high-image-quality image process matching with output characteristics of a printer, even in a case where an ordinary printer controller is used.
- In order to achieve the object, the present invention provides an image processing apparatus comprising: image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data and the first discrimination data generated by the image development means; image data generating means for generating second image data by correcting the first image data generated by the image development means on the basis of the second discrimination data generated by the discrimination data generating means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
- The invention provides an image processing apparatus comprising: image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data generated by the image development means; image data generating means for generating second image data by correcting the first image data generated by the image development means on the basis of the second discrimination data generated by the discrimination data generating means and the first discrimination data generated by the image development means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the first discrimination data generated by the image development means and the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
- The invention provides an image processing apparatus comprising: image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data generated by the image development means or using the first image data and the first discrimination data; image processing means for subjecting the first image data generated by the image development means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means and the first discrimination data generated by the image development means; and image output means for outputting image data processed by the image processing means.
- The invention provides an image processing apparatus comprising: input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data; setting means for desiredly setting the type of attributes represented by the first discrimination data input by the input means; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the type of attributes set by the setting means and the first image data and the first discrimination data input by the input means; image data generating means for generating second image data by correcting the first image data input by the input means on the basis of the second discrimination data generated by the discrimination data generating means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
- The invention provides an image processing apparatus comprising: input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data input by the input means; setting means for desiredly setting the type of attributes represented by the first discrimination data input by the input means; image data generating means for generating second image data by correcting the first image data input by the input means on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
- The invention provides an image processing apparatus comprising: input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data input by the input means; setting means for desiredly setting the type of attributes represented by the first discrimination data input by the input means; image processing means for subjecting the first image data input by the input means to a predetermined process on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
- The invention provides an image processing method for image-processing information described in a page description language, and outputting an image, comprising: generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of the information described in the page description language; generating second discrimination data different from the first discrimination data, using the generated first image data and first discrimination data; generating second image data by correcting the generated first image data on the basis of the generated second discrimination data; subjecting the generated second image data to a predetermined process on the basis of the generated second discrimination data; and outputting processed image data.
- FIG. 1 is a block diagram showing the structure of an image processing apparatus according to a first embodiment of the present invention;
- FIG. 2 shows an example of the structure of image development means;
- FIG. 3 shows an example of the structure of discrimination data generating means;
- FIG. 4 shows an example of the structure of an edge detection section in the discrimination data generating means;
- FIG. 5 shows an example of the structure of a color detection section in the discrimination data generating means;
- FIG. 6 shows an example of the structure of a synthetic determination section in the discrimination data generating means;
- FIG. 7 shows an example of conversion by a converter;
- FIG. 8 shows an example of the structure of image data generating means;
- FIG. 9A shows an example of first image data;
- FIG. 9B shows an example of second image data in a case where an output value of the first image data has been replaced;
- FIG. 10 is a view for describing a smoothing process;
- FIG. 11 shows an example of the structure of image processing means;
- FIG. 12 shows an example of a correction table;
- FIG. 13 shows an example of the correction table;
- FIG. 14 shows an example of the correction table;
- FIG. 15 shows an example of the correction table;
- FIG. 16 shows an example of the correction table;
- FIG. 17 shows an example of the correction table;
- FIG. 18 shows an example of the correction table;
- FIG. 19 shows an example of the correction table;
- FIG. 20 is a block diagram showing the structure of an image processing apparatus according to a second embodiment;
- FIG. 21 is a block diagram showing the structure of an image processing apparatus according to a third embodiment;
- FIG. 22 is a block diagram showing the structure of an image processing apparatus according to a fourth embodiment;
- FIG. 23 is a block diagram showing the structure of an image processing apparatus according to a fifth embodiment; and
- FIG. 24 is a block diagram showing the structure of an image processing apparatus according to a sixth embodiment.
- Embodiments of the present invention will now be described with reference to the accompanying drawings.
- FIG. 1 is a block diagram showing the structure of an image processing apparatus 1 according to a first embodiment of the present invention. This image processing apparatus 1 is called a printer in usual cases. The apparatus receives document data, etc. produced by a personal computer via a network, etc., generates image data comprising toner amount information, and transfers toner on paper, thus performing image formation.
- The image processing apparatus 1 comprises image development means (controller unit) 11, discrimination data generating means 12, image data generating means 13, image processing means 14, and image output means (printer) 15.
- The image development means 11 receives DTP (Desk Top Publishing) data formed on a personal computer or document data of a word processor, etc. as page information described in a page description language (PDL). The image development means 11 develops the received data to first image data as bit map data and to first discrimination data representative of attributes of each pixel.
- The page information contains characters as font data, figures as line description data or painted-out region data, and others as ordinary raster image data. When the page information is output as a print image, it is necessary to develop all data as the same bit map data.
- In addition, it is necessary to develop the attribute data to pixel-by-pixel discrimination data so that the image processing means 14 may perform an appropriate image quality enhancing process in accordance with the attributes of the image data.
- Alternatively, the image processing apparatus may be constructed such that the image development means 11 is provided as an external element, i.e. as a printer controller.
- The discrimination data generating means 12 generates second discrimination data for each pixel, which is necessary for controlling the image processing means 14, on the basis of the first image data and the first discrimination data. The second discrimination data differs from the first discrimination data and corresponds to an image area discrimination signal that is commonly used in a copying machine, etc.
- Accordingly, even where a scanner is connected to the image processing apparatus 1 for the purpose of use as a copying machine, the second discrimination data can be generated from the scanner input image.
- It is necessary, however, to switch the method of generating the second discrimination data depending on whether the first image data is an image obtained by processing the scanner input or an image developed from the information of the page description language.
- The image data generating means 13 corrects the first image data on the basis of the second discrimination data generated by the discrimination data generating means 12, and thus generates second image data. The correction in this context is effected by, for example, an over-print process or trapping process, which addresses the fact that a white blank portion forms between a black line and a C, M or Y color component background due to a print position error at the time of printing out, and by a character smoothing process, etc.
- The image processing means 14 performs a process for emphasizing an image (in particular, a character) at the time of printing out. General methods of the process are filtering, gamma correction, etc. A filter coefficient or a gamma correction table is switched in accordance with the second discrimination data.
- The image output means 15 uses output image data (corresponding to the ink amount of each color in the case of a printer) generated by the image processing means 14, and transfers ink on a printing medium (paper, etc.).
- FIG. 2 shows an example of the structure of the image development means11. The image development means 11 comprises a
CPU 21, aRAM 22 and apage memory 23. The page information received by the image development means 11 is converted to first image data and first discrimination data by theCPU 21, which is then developed in thepage memory 23 and transmitted pixel by pixel. - FIG. 3 shows an example of the structure of the discrimination data generating means12. The discrimination data generating means 12 comprises line buffers 31 a and 31 b, an
edge detection section 32, acolor detection section 33 and asynthetic determination section 34. - The first image data transmitted from the image development means11 is input to the
line buffer 31 a of the discrimination data generating means 12. The first image data is accumulated in theline buffer 31 a by several lines, thereby forming block data. - The first image data output from the
line buffer 31 a is sent to theedge detection section 32, and it is determined for each color component whether a center pixel (“pixel of interest”) of the block corresponds to an edge portion. - In addition, the first image data output from the
line buffer 31 a is sent to thecolor detection section 33, and it is determined based on the chroma whether the pixel of interest has an achromatic color or a chromatic color. - On the other hand, the first discrimination data transmitted from the image development means11 is input to the
line buffer 31 b of the discrimination data generating means 12. Theline buffer 31 b is used for establishing synchronism with the first image data. - The
synthetic determination section 34 outputs second discrimination data by performing synthetic determination on the basis of the edge detection result from theedge detection section 32, the determination result from thecolor detection section 33, and the first discrimination data synchronized by theline buffer 31 b. - FIG. 4 shows an example of the structure of the
edge detection section 32 in the discrimination data generating means 12. The edge detection section 32 comprises multipliers 41a and 41b, adders 42a and 42b, positive number generators 43a and 43b, an adder 44 and a comparator 45. The edge detection section 32 is provided for each of the color component image signals C, M, Y and K of the first image data input from the line buffer 31a, and edge detection is performed on the four components in parallel. - The
multiplier 41a multiplies a 3×3 matrix of the first image data by the coefficients (edge detection operators) indicated by symbol A in FIG. 4. The adder 42a sums the products calculated by the multiplier 41a. The positive number generator 43a produces the absolute value of the sum calculated by the adder 42a. - The
multiplier 41b multiplies the 3×3 matrix of the first image data by the coefficients (edge detection operators) indicated by symbol B in FIG. 4. The adder 42b sums the products calculated by the multiplier 41b. The positive number generator 43b produces the absolute value of the sum calculated by the adder 42b. - Subsequently, the
adder 44 adds the two absolute values obtained by the positive number generators 43a and 43b, and the comparator 45 compares the added value with a predetermined value, thereby determining the presence or absence of an edge. - The comparison result of the
comparator 45 is output to the synthetic determination section 34 as an edge determination result EC, EM, EY or EK, in association with the color component image signal C, M, Y or K in the first image data input from the line buffer 31a. - FIG. 5 shows an example of the structure of the
color detection section 33 in the discrimination data generating means 12. The color detection section 33 comprises subtracters 51a, 51b and 51c, positive number generators 52a, 52b and 52c, a maximum value selector 53, a comparator 54, digitizers 55a to 55d, selectors 56a to 56d, AND gates 57a to 57c, and a NOT gate 58. - The
subtracter 51a calculates the difference in density between the color components (C, Y) of the image signals of the first image data input from the line buffer 31a, and outputs the difference to the positive number generator 52a. The positive number generator 52a produces the absolute value of the input density difference between the color components (C, Y), and outputs the absolute value to the maximum value selector 53. - The
subtracter 51b calculates the difference in density between the color components (C, M) of the image signals of the first image data input from the line buffer 31a, and outputs the difference to the positive number generator 52b. The positive number generator 52b produces the absolute value of the input density difference between the color components (C, M), and outputs the absolute value to the maximum value selector 53. - The
subtracter 51c calculates the difference in density between the color components (M, Y) of the image signals of the first image data input from the line buffer 31a, and outputs the difference to the positive number generator 52c. The positive number generator 52c produces the absolute value of the input density difference between the color components (M, Y), and outputs the absolute value to the maximum value selector 53. - The
maximum value selector 53 selects the maximum of the values input from the positive number generators 52a, 52b and 52c, and outputs it to the comparator 54. - The
comparator 54 compares the input maximum value with a predetermined value, and determines whether the color is achromatic or chromatic. - On the other hand, the
digitizer 55a digitizes the density of the color component image signal C of the first image data input from the line buffer 31a. The digitizer 55b digitizes the density of the color component image signal M of the first image data input from the line buffer 31a. The digitizer 55c digitizes the density of the color component image signal Y of the first image data input from the line buffer 31a. The digitizer 55d digitizes the density of the color component image signal K of the first image data input from the line buffer 31a. - The digitized result represents which color component is effective in the synthetic determination section. When the digitized result of the image signal K, i.e. the output of the
digitizer 55d, is “1”, a black over-print process (which incorporates the background density into the image data of the color components C, M and Y) may have been performed at the time of image development. For this reason, an AND value between the digitized result of each image signal C, M and Y and an inverted value of the digitized result of the image signal K is obtained. - Specifically, the digitized result of the
digitizer 55a and an inverted value of the digitized result of the digitizer 55d are input to the AND gate 57a to produce an AND value. The digitized result of the digitizer 55b and an inverted value of the digitized result of the digitizer 55d are input to the AND gate 57b to produce an AND value. The digitized result of the digitizer 55c and an inverted value of the digitized result of the digitizer 55d are input to the AND gate 57c to produce an AND value. - The
selector 56a receives the comparison result of the comparator 54 and the AND value of the AND gate 57a, selects one of them, and outputs a select result SC. The selector 56b receives the comparison result of the comparator 54 and the AND value of the AND gate 57b, selects one of them, and outputs a select result SM. The selector 56c receives the comparison result of the comparator 54 and the AND value of the AND gate 57c, selects one of them, and outputs a select result SY. The selector 56d receives the comparison result of the comparator 54, inverted by the NOT gate 58, and the digitized result of the digitizer 55d, selects one of them, and outputs a select result SK. - This selection is provided because it is necessary to switch between use as a copying machine and use as a printer.
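The edge detection datapath of FIG. 4 and the color detection logic of FIG. 5 described above can be sketched in software as follows. This is an illustrative sketch only: the actual edge detection operators (symbols A and B in FIG. 4) and the comparator thresholds appear only in the figures, so Sobel-style operators and round-number thresholds are assumed here.

```python
# Assumed stand-ins for the operators shown as symbols A and B in FIG. 4
# (the real coefficients appear only in the figure).
OPERATOR_A = [[-1, -2, -1],
              [ 0,  0,  0],
              [ 1,  2,  1]]
OPERATOR_B = [[-1, 0, 1],
              [-2, 0, 2],
              [-1, 0, 1]]

def detect_edge(block, threshold=128):
    """Edge detection of FIG. 4 for one color component: multiply the 3x3
    block by each operator (multipliers 41a/41b), sum the products
    (adders 42a/42b), take absolute values (positive number generators
    43a/43b), add the two magnitudes (adder 44) and compare the result
    with a predetermined value (comparator 45)."""
    sum_a = sum(OPERATOR_A[i][j] * block[i][j] for i in range(3) for j in range(3))
    sum_b = sum(OPERATOR_B[i][j] * block[i][j] for i in range(3) for j in range(3))
    return abs(sum_a) + abs(sum_b) > threshold

def is_chromatic(c, m, y, threshold=32):
    """Chroma test of FIG. 5: subtracters 51a-51c and positive number
    generators 52a-52c form the pairwise density differences |C-Y|,
    |C-M| and |M-Y|; the maximum value selector 53 picks the largest,
    and the comparator 54 compares it with a predetermined value."""
    return max(abs(c - y), abs(c - m), abs(m - y)) > threshold

def effective_colors(dig_c, dig_m, dig_y, dig_k):
    """AND gates 57a-57c: a C, M or Y component is effective only when
    its digitized density is 1 and the digitized K density is 0, so a
    pixel produced by a black over-print is treated as black rather
    than as a colored pixel."""
    not_k = 1 - dig_k
    return (dig_c & not_k, dig_m & not_k, dig_y & not_k)
```

In this sketch a flat block yields no edge while a density step does; equal C, M and Y densities read as achromatic (gray); and a set K bit masks the chromatic components, mirroring the hardware behaviour described above.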
- FIG. 6 shows an example of the structure of the
synthetic determination section 34 in the discrimination data generating means 12. The synthetic determination section 34 comprises converters 61a to 61d and AND gates 62a to 62d. - Signals EC, EM, EY and EK input from the
edge detection sections 32 associated with the image signals C, M, Y and K represent the edge detection results of C, M, Y and K. Signals SC, SM, SY and SK input from the color detection section 33 represent the color detection results of C, M, Y and K. - The
converter 61a receives the edge detection result EC from the edge detection section 32 and the first discrimination data from the line buffer 31b, and outputs desired converted discrimination data based on them. - The
converter 61b receives the edge detection result EM from the edge detection section 32 and the first discrimination data from the line buffer 31b, and outputs desired converted discrimination data based on them. - The
converter 61c receives the edge detection result EY from the edge detection section 32 and the first discrimination data from the line buffer 31b, and outputs desired converted discrimination data based on them. - The
converter 61d receives the edge detection result EK from the edge detection section 32 and the first discrimination data from the line buffer 31b, and outputs desired converted discrimination data based on them. - FIG. 7 shows an example of conversion by the
converters 61a to 61d. - For example, when the first discrimination data is “TEXT” and the edge detection result is “EDGE”, the second discrimination data is output as “NEW-TEXT” (conversion result). When the first discrimination data is “IMAGE” and the edge detection result is “NON-EDGE”, the second discrimination data is output as “NEW-GRAPHIC” (conversion result).
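The two conversions quoted above can be written as a small lookup. Note that the table below is a hypothetical fragment: FIG. 7 defines the full correspondence, of which only these two rows are quoted in the text, and the fallback behaviour for other pairs is an assumption, not part of the patent.

```python
# Fragment of the FIG. 7 conversion rule; only the two rows quoted in
# the text are reproduced here - the full table exists only in the figure.
CONVERSION_TABLE = {
    ("TEXT", "EDGE"): "NEW-TEXT",
    ("IMAGE", "NON-EDGE"): "NEW-GRAPHIC",
}

def convert(first_discrimination, edge_result):
    """Converters 61a-61d: map (first discrimination data, edge
    detection result) to converted discrimination data. Pairs outside
    the two quoted rows fall back to the input label unchanged, purely
    as a placeholder assumption."""
    return CONVERSION_TABLE.get((first_discrimination, edge_result),
                                first_discrimination)
```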
- The desired discrimination data (second discrimination data) output from the
converter 61a is input to the AND gate 62a. The AND gate 62a produces second discrimination data DC as an AND value between the desired discrimination data input from the converter 61a and the color detection result SC input from the color detection section 33. - The desired discrimination data (second discrimination data) output from the
converter 61b is input to the AND gate 62b. The AND gate 62b produces second discrimination data DM as an AND value between the desired discrimination data input from the converter 61b and the color detection result SM input from the color detection section 33. - The desired discrimination data (second discrimination data) output from the
converter 61c is input to the AND gate 62c. The AND gate 62c produces second discrimination data DY as an AND value between the desired discrimination data input from the converter 61c and the color detection result SY input from the color detection section 33. - The desired discrimination data (second discrimination data) output from the
converter 61d is input to the AND gate 62d. The AND gate 62d produces second discrimination data DK as an AND value between the desired discrimination data input from the converter 61d and the color detection result SK input from the color detection section 33. - FIG. 8 shows an example of the structure of the image data generating means 13. The image data generating means 13 comprises line buffers 71a and 71b, a background
density averaging section 72, a character density averaging section 73, and a selector 74. - The line buffer 71a accumulates n lines of the first image data output from the image development means 11.
- The
line buffer 71b accumulates n lines of the second discrimination data output from the discrimination data generating means 12. - The background
density averaging section 72 calculates the average density of each of the color components C, M and Y over those pixels, within an area of n×n pixels around the pixel of interest, for which the second discrimination data DK on the color component K is not “NEW-TEXT”. - On the other hand, the character
density averaging section 73 calculates the average density of each of the color components C, M, Y and K within an area of m×m pixels (m≦n) around the pixel of interest. - The selector 74 outputs second image data by properly replacing the pixel values in accordance with the second discrimination data on the pixel of interest.
- For example, when the second discrimination data DK of the pixel of interest is “NEW-TEXT” and all the pixel values of C, M and Y are zero, the data C, M, Y of the pixel of interest shown in FIG. 9A is changed to the output value of the background
density averaging section 72, as shown in FIG. 9B (over-print process or trapping process). Specifically, the first image data C, M, Y, K shown in FIG. 9A is replaced with the second image data C, M, Y, K shown in FIG. 9B. - Similarly, when the second discrimination data on the color component K of the pixel of interest is “NEW-TEXT”, the pixel value of K of the pixel of interest is replaced with the output value of the character
density averaging section 73, as shown in parts a, b and c of FIG. 10 (smoothing process, etc.). - The processing of the image data generating means 13 has been described above merely by way of example, and the content of the processing is not limited to the above-described one.
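The over-print (trapping) replacement of FIGS. 9A-9B can be sketched as follows. The window layout, the discrimination labels and the integer averaging are illustrative assumptions; the patent only fixes the rule that the C, M, Y values of a black-text pixel are filled in from the average of the surrounding non-text background.

```python
def background_average(window, labels):
    """Background density averaging section 72: average each of C, M
    and Y over the pixels of an n x n window whose second discrimination
    data on K is not "NEW-TEXT" (i.e. the background around a character).
    `window` is a flat list of (c, m, y) tuples, `labels` the matching
    DK values."""
    bg = [(c, m, y) for (c, m, y), dk in zip(window, labels) if dk != "NEW-TEXT"]
    if not bg:
        return (0, 0, 0)
    return tuple(sum(comp) // len(bg) for comp in zip(*bg))

def trap_pixel(pixel_cmyk, dk, window, labels):
    """Selector 74 for the over-print case of FIGS. 9A-9B: if the pixel
    of interest is NEW-TEXT in K and its C, M, Y values are all zero,
    substitute the background averages so the black character prints
    over the background color (over-print / trapping)."""
    c, m, y, k = pixel_cmyk
    if dk == "NEW-TEXT" and c == m == y == 0:
        c, m, y = background_average(window, labels)
    return (c, m, y, k)
```

Applied to a black-only text pixel sitting on a cyan-magenta background, the pixel's C, M, Y values become the background averages while K is left untouched.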
- FIG. 11 shows an example of the structure of the image processing means 14. The image processing means 14 comprises line buffers 101a and 101b, a
filter section 102, a gamma correction section 103, and a screen processing section 104. - The
line buffer 101a accumulates several lines of the second image data generated by the image data generating means 13 for the purpose of filter processing. - The
line buffer 101b outputs the second discrimination data on the pixel of interest (the center pixel of an image matrix) in synchronism with the second image data. - The
filter section 102 multiplies each pixel of the image matrix buffered by the line buffer 101a by a predetermined coefficient and calculates the sum of the products. In this case, the filter section 102 changes the coefficients for multiplication in accordance with the second discrimination data output synchronously from the line buffer 101b. - The
gamma correction section 103 corrects each pixel of the second image data for each color component, using correction tables as shown in FIGS. 12 to 19. In this case, the gamma correction section 103 switches the correction table in accordance with the second discrimination data output synchronously from the line buffer 101b. - A correction table shown in FIG. 12 relates to correction of color component C in a case where the second discrimination data is “NEW-TEXT”.
- A correction table shown in FIG. 13 relates to correction of color component C in a case where the second discrimination data is not “NEW-TEXT”.
- A correction table shown in FIG. 14 relates to correction of color component M in a case where the second discrimination data is “NEW-TEXT”.
- A correction table shown in FIG. 15 relates to correction of color component M in a case where the second discrimination data is not “NEW-TEXT”.
- A correction table shown in FIG. 16 relates to correction of color component Y in a case where the second discrimination data is “NEW-TEXT”.
- A correction table shown in FIG. 17 relates to correction of color component Y in a case where the second discrimination data is not “NEW-TEXT”.
- A correction table shown in FIG. 18 relates to correction of color component K in a case where the second discrimination data is “NEW-TEXT”.
- A correction table shown in FIG. 19 relates to correction of color component K in a case where the second discrimination data is not “NEW-TEXT”.
- The
screen processing section 104 processes each pixel of the corrected second image data input from the gamma correction section 103 in accordance with the second discrimination data input synchronously from the line buffer 101b, thereby outputting image data of each color component matched to the image output means 15 in the rear stage. The processing is, for example, an error spreading process for converting image data of 8 bits per pixel (256 tone levels) to image data of 1 bit (2 tone levels). - The image output means 15 transfers the output image data from the
screen processing section 104 onto a printing medium (paper or the like). - In the first embodiment, the first discrimination data is generated by the image development means and the second discrimination data is generated by the discrimination data generating means, for example, in the following manner.
- a) The image development means generates first discrimination data that discriminates whether each pixel is associated with a character or a line figure, and the discrimination data generating means generates second discrimination data that does not discriminate whether each pixel is associated with a character or a line figure, using the first discrimination data generated by the image development means.
- The character is an object disposed in the first image data as font data.
- The line figure is an object described by straight lines and curves.
- b) The image development means generates first discrimination data that does not discriminate whether each pixel is associated with a line figure or a plane figure, and the discrimination data generating means generates second discrimination data that discriminates whether each pixel is associated with a line figure or a plane figure, using the first discrimination data generated by the image development means.
- The plane figure is an object that is painted, in its entirety or in each of its components, with a uniform density.
- c) The image development means generates first discrimination data that does not discriminate whether each pixel is associated with a contour portion or an inside portion of a plane figure, and the discrimination data generating means generates second discrimination data that discriminates whether each pixel is associated with a contour portion or an inside portion of a plane figure, using the first discrimination data generated by the image development means.
- d) The image development means generates first discrimination data that discriminates whether each pixel is associated with a plane figure or a tone image, and the discrimination data generating means generates second discrimination data that does not discriminate whether each pixel is associated with a plane figure or a tone image, using the first discrimination data generated by the image development means.
- e) The image development means generates first discrimination data that discriminates that each pixel is associated with a tone image, and the discrimination data generating means generates second discrimination data that discriminates the magnitude of density variation in each pixel, using the first discrimination data generated by the image development means.
- As has been described above, the first embodiment comprises the discrimination data generating means for generating the second discrimination data on the basis of the first image data and the first discrimination data generated from the page information described in the page description language, and the image data generating means for correcting the first image data on the basis of the second discrimination data and generating the second image data, thereby performing an image quality enhancing process matching with the output characteristics of the printer.
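One concrete instance of such an output-matching step is the tone reduction performed by the screen processing section 104 (8 bits per pixel down to 1 bit). A one-dimensional error spreading sketch is shown below; the mid-scale threshold and the error bookkeeping are illustrative assumptions, and a real implementation would spread the error over a two-dimensional neighborhood.

```python
def error_diffuse_row(row):
    """1-D sketch of the error spreading process of the screen
    processing section 104: threshold each 8-bit value to 1 bit and
    carry the quantization error forward to the next pixel, so that
    the average tone of the row is preserved."""
    out, err = [], 0
    for value in row:
        v = value + err
        bit = 1 if v >= 128 else 0  # assumed mid-scale threshold
        out.append(bit)
        err = v - bit * 255         # residual error carried forward
    return out
```

A mid-gray row alternates between 0 and 1 under this rule, which is exactly how the error spreading preserves tone after the 256-to-2 level reduction.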
- Second to sixth embodiments of the invention will now be described.
- FIG. 20 shows the structure of an
image processing apparatus 2 according to a second embodiment. - The main difference between the
image processing apparatus 2 of the second embodiment and the image processing apparatus 1 shown in FIG. 1 is that a discrimination data generating means 122 generates second discrimination data without using the first discrimination data generated by image development means 121. Thereby, the independence of the first discrimination data and the second discrimination data is enhanced, and a greater degree of freedom is provided in the circuit configuration.
- FIG. 21 shows the structure of an
image processing apparatus 3 according to a third embodiment. - In the
image processing apparatus 3 of the third embodiment, the image data generating means 123 of theimage processing apparatus 2 shown in FIG. 20 is omitted. Since theimage processing apparatus 3 of the third embodiment does not generate the second image data, the line memory, etc. are not needed and the image processing apparatus can be formed at low cost. - FIG. 22 shows the structure of an
image processing apparatus 4 according to a fourth embodiment. In theimage processing apparatus 4 of the fourth embodiment, the controller unit (image development means 11) of theimage processing apparatus 1 shown in FIG. 1 is omitted and it is provided as an external element. In addition, interface means (data input means 141) as interface with the external controller and discrimination type setting means 146 are provided. - The data input means141 of the
image processing apparatus 4 is, for example, an interface unit of a LAN (Local Area Network). - The discrimination type setting means146 is a means for setting the type of the first discrimination data input by the external controller. Specification information of the external controller is input to the discrimination type setting means 146, and the discrimination type setting means 146 is preset by the operation by a user, a manager, a designer, etc.
- The discrimination types of first discrimination data described in connection with the first embodiment are “TEXT”, “GRAPHIC” and “IMAGE”, and the correspondency of the three discrimination types as shown in FIG. 7 is registered (set) by the discrimination type setting means146.
- With this structure, an external controller that generates any kind of discrimination data can be connected to the
image processing apparatus 4. - FIG. 23 shows the structure of an image processing apparatus 5 according to a fifth embodiment. In the image processing apparatus 5 of the fifth embodiment, the controller unit (image development means 121) of the
image processing apparatus 2 shown in FIG. 20 is omitted and is provided as an external element. In addition, interface means (data input means 151) serving as an interface with the external controller, and discrimination type setting means 156, are provided. - The data input means 151 of the image processing apparatus 5 is, for example, an interface unit of a LAN (Local Area Network).
- The discrimination type setting means156 is a means for setting the type of the first discrimination data input by the external controller. Specification information of the external controller is input to the discrimination type setting means 156, and the discrimination type setting means 156 is preset by the operation by a user, a manager, a designer, etc.
- The discrimination types of first discrimination data described in connection with the first embodiment are “TEXT”, “GRAPHIC” and “IMAGE”, and the correspondency of the three discrimination types as shown in FIG. 7 is registered (set) by the discrimination type setting means156.
- With this structure, an external controller that generates any kind of discrimination data can be connected to the image processing apparatus5.
- FIG. 24 shows the structure of an
image processing apparatus 6 according to a sixth embodiment. In the image processing apparatus 6 of the sixth embodiment, the controller unit (image development means 131) of the image processing apparatus 3 shown in FIG. 21 is omitted and is provided as an external element. In addition, interface means (data input means 161) serving as an interface with the external controller, and discrimination type setting means 165, are provided. - The data input means 161 of the
image processing apparatus 6 is, for example, an interface unit of a LAN (Local Area Network). - The discrimination type setting means 165 is a means for setting the type of the first discrimination data input by the external controller. Specification information of the external controller is input to the discrimination type setting means 165, and the discrimination type setting means 165 is preset through operation by a user, a manager, a designer, etc.
- The discrimination types of first discrimination data described in connection with the first embodiment are “TEXT”, “GRAPHIC” and “IMAGE”, and the correspondency of the three discrimination types as shown in FIG. 7 is registered (set) by the discrimination type setting means165.
- With this structure, an external controller that generates any kind of discrimination data can be connected to the
image processing apparatus 6. - As has been described above, according to the embodiments of the present invention, a high-image-quality image process matching with output characteristics of a printer can be performed, even in a case where an ordinary printer controller is used.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2001290212A JP2003051939A (en) | 2001-08-06 | 2001-09-21 | Image processing apparatus and image processing method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2000286753A JP2002101051A (en) | 2000-09-21 | 2000-09-21 | WDM optical interconnection equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
US20030025926A1 true US20030025926A1 (en) | 2003-02-06 |
Family
ID=34640449
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/917,707 Expired - Fee Related US6907198B2 (en) | 2000-09-21 | 2001-07-31 | Wavelength division multiplexed optical interconnection device |
US09/921,703 Abandoned US20030025926A1 (en) | 2000-09-21 | 2001-08-06 | Image processing apparatus and image processing method |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/917,707 Expired - Fee Related US6907198B2 (en) | 2000-09-21 | 2001-07-31 | Wavelength division multiplexed optical interconnection device |
Country Status (2)
Country | Link |
---|---|
US (2) | US6907198B2 (en) |
JP (1) | JP2002101051A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6642993B2 (en) | 2001-12-27 | 2003-11-04 | Kabushiki Kaisha Toshiba | Image processing device and method for controlling the same |
US20040234134A1 (en) * | 2003-05-19 | 2004-11-25 | Kabushiki Kaisha Toshiba | Image processing apparatus and image processing method |
US20050185071A1 (en) * | 2004-01-23 | 2005-08-25 | Sanyo Electric Co., Ltd. | Image signal processing apparatus |
US8704447B2 (en) | 2009-05-28 | 2014-04-22 | Citizen Holdings Co., Ltd. | Light source device |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6643419B2 (en) * | 2001-12-05 | 2003-11-04 | Pacific Wave Industries, Inc. | Ultra-high speed, active polymer-silica hybrid, single control voltage MMI-based 1-by-N packet switch and WG-based WDM packet router/TDM converter and methods of making same |
JP4249474B2 (en) * | 2002-12-06 | 2009-04-02 | セイコーエプソン株式会社 | Wavelength multiplexing chip-to-chip optical interconnection circuit |
KR100541655B1 (en) * | 2004-01-07 | 2006-01-11 | 삼성전자주식회사 | Package Circuit Board and Package Using the Same |
JP4524120B2 (en) * | 2004-02-03 | 2010-08-11 | 富士通株式会社 | Blade type optical transmission equipment |
JP4462269B2 (en) * | 2004-09-29 | 2010-05-12 | 日立化成工業株式会社 | Opto-electric integrated circuit element and transmission device using the same |
US7725183B1 (en) * | 2006-02-10 | 2010-05-25 | Pacesetter, Inc. | Implantable stimulation device equipped with a hardware elastic buffer |
US7542641B1 (en) * | 2006-12-01 | 2009-06-02 | Kotura, Inc. | Multi-channel optical device |
US8041230B2 (en) * | 2008-12-12 | 2011-10-18 | Fujitsu Limited | System and method for optoelectrical communication |
US8041229B2 (en) * | 2008-12-12 | 2011-10-18 | Fujitsu Limited | System and method for optoelectrical communication |
US8965208B2 (en) * | 2009-05-22 | 2015-02-24 | Kotura, Inc. | Multi-channel optical device |
US9178620B2 (en) | 2011-09-23 | 2015-11-03 | Te Connectivity Nederland B.V. | Optical interface for bidirectional communications |
US9525490B2 (en) * | 2012-07-26 | 2016-12-20 | Aurrion, Inc. | Reconfigurable optical transmitter |
US9268890B2 (en) | 2012-09-05 | 2016-02-23 | International Business Machines Corporation | Designing photonic switching systems utilizing equalized drivers |
US8775992B2 (en) | 2012-09-05 | 2014-07-08 | International Business Machines Corporation | Designing photonic switching systems utilizing equalized drivers |
KR101999199B1 (en) * | 2013-03-12 | 2019-07-11 | 삼성전자주식회사 | Optical package |
US9446467B2 (en) | 2013-03-14 | 2016-09-20 | Taiwan Semiconductor Manufacturing Company, Ltd. | Integrate rinse module in hybrid bonding platform |
US10488682B2 (en) * | 2013-08-31 | 2019-11-26 | Acacia Communications, Inc. | Distributed CMOS driver with enhanced drive voltage for silicon optical push-pull Mach-Zehnder modulators |
CN107199405B (en) * | 2016-03-15 | 2023-04-18 | 中国科学院沈阳自动化研究所 | Automatic slicer for corn breeding and sampling |
US10133142B2 (en) | 2016-03-29 | 2018-11-20 | Acacia Communications, Inc. | Silicon modulators and related apparatus and methods |
US10177161B2 (en) * | 2016-12-28 | 2019-01-08 | Intel Corporation | Methods of forming package structures for enhanced memory capacity and structures formed thereby |
JP7181462B2 (en) | 2019-01-17 | 2022-12-01 | 日本電信電話株式会社 | photodetector |
JP2020144294A (en) * | 2019-03-08 | 2020-09-10 | ルネサスエレクトロニクス株式会社 | Semiconductor device and manufacturing method for the same |
US11728894B2 (en) * | 2020-04-13 | 2023-08-15 | Avicenatech Corp. | Optically-enhanced multichip packaging |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5724444A (en) * | 1994-09-16 | 1998-03-03 | Kabushiki Kaisha Toshiba | Image forming apparatus |
US5875036A (en) * | 1990-11-08 | 1999-02-23 | Canon Kabushiki Kaisha | Image processing apparatus which separates an image signal according to density level |
US20010013953A1 (en) * | 1999-12-27 | 2001-08-16 | Akihiko Uekusa | Image-processing method, image-processing device, and storage medium |
US6549657B2 (en) * | 1995-04-06 | 2003-04-15 | Canon Kabushiki Kaisha | Image processing apparatus and method |
US6631207B2 (en) * | 1998-03-18 | 2003-10-07 | Minolta Co., Ltd. | Image processor including processing for image data between edge or boundary portions of image data |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5394489A (en) * | 1993-07-27 | 1995-02-28 | At&T Corp. | Wavelength division multiplexed optical communication transmitters |
- 2000-09-21 JP JP2000286753A patent/JP2002101051A/en active Pending
- 2001-07-31 US US09/917,707 patent/US6907198B2/en not_active Expired - Fee Related
- 2001-08-06 US US09/921,703 patent/US20030025926A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
JP2002101051A (en) | 2002-04-05 |
US20030025962A1 (en) | 2003-02-06 |
US6907198B2 (en) | 2005-06-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20030025926A1 (en) | Image processing apparatus and image processing method | |
US7760934B2 (en) | Color to grayscale conversion method and apparatus utilizing a high pass filtered chrominance component | |
US5428377A (en) | Color spatial filtering for thermal ink jet printers | |
EP1014694B1 (en) | Automated enhancement of print quality based on feature size, shape, orientation, and color | |
US8477324B2 (en) | Image processor and image processing method that uses s-shaped gamma curve | |
EP1349371A2 (en) | Image processing apparatus, image processing program and storage medium storing the program | |
EP1073260B1 (en) | Image processing device, image forming device incorporating the same, and storage medium for storing program used thereby | |
JP2012518303A (en) | Image processing system for processing digital image and image processing method for processing digital image | |
US5784496A (en) | Error sum method and apparatus for intercolor separation control in a printing system | |
JP4498233B2 (en) | Image processing apparatus and image processing method | |
JP4386216B2 (en) | Color printing system and control method thereof | |
US7315398B2 (en) | Multi-level error diffusion with color image data | |
JP4153568B2 (en) | How to determine the colorant to be used for printing gray areas | |
JP4377249B2 (en) | Ink consumption reduction error diffusion | |
US7295347B2 (en) | Image processing method for generating multi-level data | |
US20100079818A1 (en) | Image forming apparatus to improve image quality and image quality improvement method | |
JP6736299B2 (en) | Printing device, printing method, and program | |
US6249354B1 (en) | Image processing apparatus and method | |
JP7510611B2 (en) | Image forming device | |
US20030197897A1 (en) | Quantization apparatus and method, and inkjet printing apparatus | |
JP2000341547A (en) | Device and method for image processing | |
JP2001309188A (en) | Image processing unit and image processing method | |
JP2003051939A (en) | Image processing apparatus and image processing method | |
JP4124900B2 (en) | Color printer and control method thereof | |
JP3864405B2 (en) | Color printing system and color printer |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: TOSHIBA TEC KABUSHIKI KAISHA, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FUCHIGAMI, TAKAHIRO;TABATA, SUNAO;REEL/FRAME:012054/0696 Effective date: 20010717 |
|
AS | Assignment |
Owner name: TOSHIBA TEC KABUSHIKI KAISHA, JAPAN Free format text: ASSIGNMENT (ONE-HALF INTEREST);ASSIGNOR:TOSHIBA TEC KABUSHIKI KAISHA;REEL/FRAME:014118/0099 Effective date: 20030530 Owner name: KABUSHIKI KAISHA TOSHIBA, JAPAN Free format text: ASSIGNMENT (ONE-HALF INTEREST);ASSIGNOR:TOSHIBA TEC KABUSHIKI KAISHA;REEL/FRAME:014118/0099 Effective date: 20030530 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |