CA1300735C - Color graphic image processing method - Google Patents

Color graphic image processing method

Info

Publication number
CA1300735C
Authority
CA
Canada
Prior art keywords
color
image
data
graphic image
codes
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CA000530066A
Other languages
French (fr)
Inventor
Yoshihiro Okada
Keiichiro Hyodo
Toshiyuki Sakai
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kyocera Mita Industrial Co Ltd
Original Assignee
Mita Industrial Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mita Industrial Co Ltd filed Critical Mita Industrial Co Ltd
Application granted granted Critical
Publication of CA1300735C publication Critical patent/CA1300735C/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46 Colour picture communication systems
    • H04N1/64 Systems for the transmission or the storage of the colour picture signal; Details therefor, e.g. coding or decoding means therefor

Abstract

ABSTRACT OF THE DISCLOSURE

A color graphic image processing system which first reads the color graphic image to be processed, encodes the read-out signals into 4 through 6 encoded words per color, and stores the encoded words after completing data compression. The system then transmits the stored data and converts it into image signals suited to the output device, which finally outputs these image signals.

Description


SPECIFICATION

TITLE OF THE INVENTION

Color graphic image processing system

BACKGROUND OF THE INVENTION
The present invention relates to a color graphic image processing system, more particularly, to a novel image processing system for processing color graphic images.
One of the prior arts, disclosed in Japanese Patent Laid-Open No. 161982/1984, presents a system for processing full-color images, i.e., normal color images.
This system first reads the full-color image to be processed as 8-bit input signals, then compensates for shading, i.e., unevenness of optical luminance, and then executes gamma compensation, which allows the output copy density corresponding to the input original density to be output as 6-bit signals by degrading the characteristic of the dither cumulative frequency in conjunction with the voltage output from the charge-coupled device (CCD), so that the relationship between these density values remains 1:1 independent of the uneven characteristics of the input and output systems. The system presented by the above prior art then properly adjusts the tonal characteristics of the respective chrominance signals in accordance with masking compensation and UCR compensation processes. The system then converts half-tone images into binary codes using a dither processing circuit, and then executes pulse width modulation using a multilevel conversion circuit to improve the tonal characteristic of half-tone images before eventually allowing a laser printer to reproduce a picture identical to the image to be processed.
Another prior art, disclosed in Japanese Patent Laid-Open No. 57573/1984, proposes a technique for sharpening the picture image by converting those signals other than the white level close to the white area into signals indicating white color, and simultaneously converting those signals other than the black level close to the black area into signals indicating black color, before encoding these into binary codes.
The former system, related to Japanese Patent Laid-Open No. 161982/1984, aims at applying its processes to full-color images. Concretely, by applying 8-bit data to each of the three primary colors comprised of red, green, and blue components, this system allows each picture element to represent about 16 million colors. This system executes all the needed processing operations by reading full-color image signals consisting of normal colors, thus obliging each picture element to represent color images by applying 24 bits (8 bits x 3). Since almost all of the conventional personal computers and microprocessors available today process data with an 8-bit data width, the system proposed by Japanese Patent Laid-Open No. 161982/1984 cannot easily be operated using any of these conventional personal computers and microprocessors presently available. Furthermore, this system aims at precisely reproducing pictures by reading the original image signals. To achieve this aim, it executes a variety of processes including shading compensation, gamma compensation, a masking process, a UCR process, a dither process, and conversion of the read image signals into binary codes, and as a result, this system unavoidably needs to execute complex data processing operations using a complicated system constitution.
The latter image processing device, proposed by Japanese Patent Laid-Open No. 57573/1984, first detects the edges of the original picture by applying an edge-detection operator such as a Laplacian operator or a differential operator, before eventually sharpening the edges of images by modifying the density values of picture elements in the periphery of the picture edges. As a result, if characters or lines having an extremely thin configuration are present, the system related to the above art cannot easily detect the substance of such fine characters or lines; in addition, since it is difficult for this system to correctly determine the density value of fine characters or lines, it cannot easily determine the modified density value of the picture elements in the periphery of the picture edges, thus resulting in difficulty for the system to securely reproduce edges having a sharp contrast effect.
The inventors of the present invention followed up studies on color graphs, which are substantially artificial images, and eventually discovered a variety of characteristics described below. Note that, among the variety of colored images, a color graph, when observed from a clearly visible distance, presents specific colored areas in which gradual color variation is not present.
(1) A picture contains ten-odd colors in all.
(2) Colors presented in respective areas are uniform and contain certain meanings (information).
(3) A color graphic picture contains colored characters and fine lines as well.
Accordingly, if a process identical to that applicable to a full-color picture is also applied to a color graph, a variety of problems arise, which are described below.
(1) Any conventional color scanner made available as an input device provides 256 tone wedges for each component of the three primary colors, so that 24 bits (8 bits x 3) of data are needed for each picture element. In addition, each color scanner must be provided with practically workable linearity throughout the 256 tone wedges, thus eventually resulting in expensive cost.
(2) Although the color scanner needs a complex constitution, the optical characteristics of each color scanner, such as its spectro-sensitivity distribution, and its physical characteristics, such as the aperture size, substantially the iris diaphragm, differ from one scanner to another. Actually, image signals received from various input devices do not always match certain colors. Note that the spectro-sensitivity distribution is the distribution of output signals against light having a specific frequency.
(3) When converting an image presented in a specific area where the image color is visually uniform into a signal using a color scanner, the image-converted signal does not show a constant value. In addition, even if there is such a specific area where the color graph visually remains uniform in color, after delivering the color via an output unit, it may eventually be determined that an uneven color effect is still present in this area.
(4) Extraction of characters and fine lines from a specific area involves a certain difficulty. More particularly, since the dynamic range which is substantially the object of quantization is relatively wide, the image signal from any conventional color scanner does not show a constant value in a portion which is visually seen as uniform by human eyes. Consequently, if the portion visually seen as uniform by human eyes should be extracted, it is necessary for the operator to implement a processing operation such as a smoothing process, for example converting fine structure into a widely visible range, in order that the density value can be stabilized in the needed portion by means of compensation. However, if a smoothing process is applied, the fine configuration of characters and fine lines cannot properly be held unaffected. To compensate for this, it is necessary for the color graphic processing system to preliminarily apply a masking process to finely composed characters and fine lines, which in turn results in complicated processes as a whole.
(5) The data input unit is provided with the capacity for producing 8-bit, 256-tone-wedge color data signals for each component of the three primary colors. On the other hand, the data output unit is provided with such a low capacity as to deliver only a few bits of color data per component of the three primary colors; normally, it is allowed to output about 1 bit of color data per component. As a result, although the data input unit can produce 16 million colors expressed by 8-bit data per picture element, the data output unit can merely output 8 colors expressed by 1-bit data per picture element. This unavoidably generates a significantly large gap in the amount of information between the chromatic resolution of the data input unit and the chromatic expression capacity of the data output unit.

SUMMARY OF THE INVENTION
The primary object of the present invention is to provide a novel system which securely executes image processing operations, including encoding and decoding of color graphic signals, using a small amount of data.
In accordance with the present invention there is provided a color graphic image processing method comprising the steps of:
reading a color graphic image by utilization of a color image input device, with the color image input device transforming that which is read into image data,
coding the image data in accordance with section dividing means which acts to compress the image data by dividing a density region of each color in the color graphic image into sections represented by 4 through 6 codes, with the codes being based upon data obtained by measuring a plurality of reference color samples with said color image input device,
converting the coded data to a color image output signal in accordance with conversion means in which a code corresponding to a reference color sample is related to a color image output signal which is determined so as to cause a color image output device to output a color substantially the same as said reference color sample, and
providing the color image output signal to the color image output device to reproduce a color graphic image.
The color graphic image processing system related to the present invention selects a novel signal-encoding system designed for absorbing the input characteristics, such as the optical and physical characteristics, of individual data input devices.
The color graphic image processing system selects a novel signal-decoding system designed for absorbing the output characteristics of individual output devices.
After encoding the read-out signals into 4 through 6 encoded words per color, the color graphic image processing system related to the present invention may improve the data-compression efficiency by further encoding those encoded words into run-length codes, or it may detect edges at the specific portions where the encoded words change so that sharper edges can be generated.
The color graphic image processing system related to the present invention first reads the color graphic image to be processed using a color graphic input device and then encodes the read-out image data into a relatively small number of encoded words, i.e., 4 through 6 encoded words per color, so that input signals having a relatively small number of bits can be generated. The system then compresses the encoded data having this small number of bits for storage in a small-capacity storage device. After transmitting and decoding these data, the image processing system converts them into signals suited to the output device before eventually allowing the output device to deliver the completely processed image signals.
By selecting a specific encoding system designed for absorbing the input characteristics, such as the optical and physical characteristics, of individual input devices, the color graphic image processing system can correctly express a certain color by applying a specific encoded word, independent of differences in those input characteristics.
Likewise, by selecting a specific decoding system designed for absorbing the output characteristics of individual output devices, the color graphic image processing system can correctly reproduce a certain color corresponding to a certain encoded word, independent of differences in those output characteristics. In addition, by further converting the encoded data into run-length codes, the color graphic image processing system can more effectively compress the data. The edges of a color graphic image can effectively be sharpened by detecting the edge picture elements at which the encoded words change; by recognizing such an edge as a border, the color graphic image processing system can refine the color graphic image to be processed.
More particularly, as described earlier, a color graph has three characteristics. Of these, a further explanation of the second characteristic is given below. It is a characteristic in which a certain amount of information is present when the colors in respective areas differ significantly from the colors present in other areas. In other words, a color graph has an inherent characteristic whereby the color within a certain area does not itself carry a significant amount of information. This is very clear in the light of the following four factors.
(1) Color graph is provided with various colors for improving visual identification.
(2) Color tones of identical images differ from each other according to the kinds and characteristics of output devices.
(3) Image signals of identical images differ from each other according to the kinds and characteristics of input devices.
(4) Even if the images delivered from a certain output device are in a specific color equivalency relationship with the input images under a certain condition, if external factors such as illumination vary, the color equivalency relationship can no longer be established, due to the color rendering characteristic. Note that the color equivalency relationship indicates a specific relationship in which two colors are visually identical to each other upon observation by human eyes even though their distributions of spectral reflectance physically differ from each other. The color rendering characteristic indicates a specific light source characteristic affecting the visibility of the colors of the illuminated object.


The inventors of the present invention also confirmed the following: physically, the number of colors of full-color images, mainly reproducing natural conditions, needed for human eyes to sufficiently appreciate natural views is considered to be 2^14 (two to the fourteenth power) per picture element. However, a maximum of only about 20 colors per picture element are normally made available for generating the colors of a color graph. This proves that redundancy is obviously too high when employing any conventional system which uses an enormous amount of data, corresponding to 2^14 colors, for merely reproducing a maximum of 20 colors per picture element. It is clear that color graphic colors can be adequately handled by merely stabilizing the visibility of colors in the respective areas without significantly varying the colors themselves.
Based on the knowledge mentioned above, the color graphic image processing system embodied by the present invention reduces the essential functions to be executed by input devices by providing each color with a maximum of 3-bit data, encoding the read-out data into 4 through 6 binary words per color. In addition, since the specific binary words encoded by the preceding operation are applied constantly to a specific area of the color graph, the color graphic image processing system reduces the load on the data-processing devices which compress data and extract designated areas. Furthermore, the system related to the present invention effectively prevents output images from differing significantly from each other as would otherwise be caused by the inherent characteristics of output devices.

BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will be better understood from the detailed description given hereinbelow and the accompanying drawings, which are given by way of illustration only and thus are not limitative of the present invention, in which:
FIG. 1 is the simplified block diagram denoting the constitution of one of the preferred embodiments of the color graphic image processing system related to the invention;
FIG. 2 is the simplified block diagram denoting the constitution of a scanner;
FIG. 3 is the operation flowchart denoting the flow of preparation of the table needed by the encoding sectioning system for generating correct encoded words corresponding to the reference samples;
FIG. 4 is the operation flowchart denoting the detailed algorithm needed for establishing the encoding section-division program shown in FIG. 3;
FIG. 5 is the operation flowchart denoting the flow of preparing the table needed for decoding the encoded words;
FIG. 6 is the operation flowchart denoting the flow of preparing the table needed for supplying image signals suited to respective output devices to an optional output device;
FIG. 7 is the operation flowchart denoting the flow of processes for encoding an input image using the tables prepared by the processes shown in FIGS. 3 and 4;
FIG. 8 is the operation flowchart denoting the flow of processes for generating an output image from codes using the tables prepared by the processes shown in FIGS. 5 and 6; and
FIG. 9 is the chart explaining the process for sharpening edges of a color graphic image.

DESCRIPTION OF THE PREFERRED EMBODIMENTS
Referring now more particularly to the accompanying drawings, one of the preferred embodiments of the present invention is described below.
FIG. 1 is the simplified block diagram denoting the constitution of the color graphic image processing system related to the present invention.
Data of the color graphic image to be processed is first read by an input device 1 and then supplied to an encoder 2.
The supplied data is then processed by the encoder 2 and further processed by a data-processing unit 3 of the processing and storage system before being stored in a data-storage device 4. All the needed data are transmitted to a decoder 5 of the data-output device so that the encoded data can be decoded by it, and finally, image signals that have undergone the conversion process suited for output operation are delivered to an image output device 6.
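As a rough illustration of this flow, the following sketch chains stand-ins for the units of FIG. 1 (input device 1, encoder 2, processing and storage units 3 and 4, decoder 5, output device 6); every function name, code value and color in it is hypothetical and not taken from the patent.

```python
# Stand-ins for the units of FIG. 1; all names and values here are hypothetical.
def input_device(path):            # unit 1: read the color graphic image (dummy RGB rows)
    return [[(255, 0, 0), (0, 0, 255)]]

def encoder(rows):                 # unit 2: replace each picture element by a short code word
    palette = {(255, 0, 0): 0, (0, 0, 255): 1}
    return [[palette[px] for px in row] for row in rows]

def store_and_transmit(coded):     # units 3 and 4: compress, store and forward the coded data
    return coded                   # compression is sketched separately further below

def decoder(coded):                # unit 5: turn codes back into output-device signals
    signals = {0: (255, 0, 0), 1: (0, 0, 255)}
    return [[signals[c] for c in row] for row in coded]

def output_device(rows):           # unit 6: deliver the reproduced image
    print(rows)

output_device(decoder(store_and_transmit(encoder(input_device("chart.png")))))
```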
Note that any conventional color digital scanner, such as a drum scanner, or the simplified color scanner shown in FIG. 2 can be used as the input device 1 mentioned above. When using a conventional color digital scanner such as a drum scanner, the read-out data signal should first be encoded by the encoder 2 before being transmitted to the data-processing unit 3. On the other hand, when using the simplified color scanner shown in FIG. 2, the following system constitution is employed. First, an original document D is exposed by a light source 10 through a cylindrical three-primary-color filter 11. A line sensor 12 then receives the light reflected from the exposed original document D. The image data of the exposed original document D is then converted into a digital-value signal by an A/D converter 13, which is then supplied to three-primary-color line buffers 16 through 18 via a switcher 15, which is controlled by a controller 14 together with the three-primary-color filter 11. The digitized signal output from the line buffers 16 through 18 is then applied to an encoder 19 so that it can be encoded before being applied to the data-processing unit 3 through a line buffer 20. The system constitution just mentioned above merely needs to add the three-primary-color filter 11 and the line buffers 16 through 18 for dealing with the three primary colors to a conventional contact-type black-and-white scanner, and to encode the digitized image data into 4 through 6 encoded words. This in turn allows the system to dispense with materials featuring outstanding color balance for making up the light source 10, the three-primary-color filter 11, and the line sensor 12, thus eventually allowing the entire system to be completed inexpensively.
The encoder 2 encodes the read-out color image data into 4 through 6 encoded words when a conventional color digital scanner such as a drum scanner is used, whereas the system allows the A/D converter 13 to output encoded data when the simplified color scanner shown in FIG. 2 is used.
Table 1 represents the reference samples used for determining the image-signal encoding method applicable to a certain scanner, expressed by the values of the three attributes of the Munsell color system.
As mentioned earlier, since the colors made available for a color graph are provided for improving visual identification, pure colors having a higher coloration effect are mainly used. Generally speaking, the higher the coloration effect and brilliancy, the better the visual identification. Accordingly, a better visual effect can be achieved by effectively designing a system capable of correctly encoding reference samples composed of pure colors.

Table 1

Tinctorial colors:       4R4.5/14    5Y8/13.5    4G5.5/10.5   5B4/11      6P3.5/12.5
                         4YR6.5/14   4GY7/12     5BG4.5/10    6PB3.5/13   6RP4/13.5
Non-tinctorial colors:   N1          N9.5

Table 2 represents the data of the reference samples indicated by the values of the three attributes of the Munsell color system, in which the colors were first measured by a color luminance meter and then denoted by the CIE 1964 supplementary standard colorimetric system (hereinafter referred to as the XYZ color specification).
To set up a system for encoding color data capable of absorbing the input characteristics of the input device, the standardized XYZ color specification was used for representing the reference samples, instead of using color data fed directly from the input device. Compared to the conventional absorption curves of the RGB components of the three primary colors, the absorption curves of X and Z respectively resemble those of R and B, whereas Y can be sought by linear approximation of the RGB components.
(Reference:
Approximate expression of Y in conjunction with data of RGB components from a certain input device is shown below.

Y = 0.0893 R + 0.1690 G - 0.1027 B)

Table 2
                  X        Y        Z
N1                1.77     1.55     1.76
N9.5             85.54    86.54    96.78
4R4.5/14         29.73    17.94    10.30
4YR6.5/14        47.32    36.46     8.91
5Y8/13.5         54.72    57.47     8.99
4GY7/12          31.23    42.61     7.90
4G5.5/10.5       13.62    22.36    14.35
5BG4.5/10        11.29    15.41    25.94
5B4/11           10.75    11.62    32.68
6PB3.5/13        16.51    12.34    49.29
6P3.5/13.5       22.16    14.92    31.75
6RP4/13.5        25.95    16.44    19.46

Using the above expression, it is possible for the system to convert data related to the RGB components from an input device into codes of the XYZ color specification by applying simple calculations.
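As a small illustration, the sketch below applies the quoted approximation of Y to raw scanner RGB values; the sample readings and the clamp to a non-negative result are assumptions for the example and are not taken from the patent.

```python
def approximate_y(r, g, b):
    """Approximate the CIE Y tristimulus value from scanner RGB data,
    using the reference expression quoted in the description."""
    y = 0.0893 * r + 0.1690 * g - 0.1027 * b
    return max(y, 0.0)  # assumed clamp: the linear expression can go slightly negative

# Hypothetical 8-bit scanner readings for a saturated red patch.
print(approximate_y(220, 30, 25))
```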
Table 3 represents the encoded words assigned along the X, Y and Z axes when the XYZ color specification is applied to the reference samples denoted by the values of the three attributes of the Munsell color system.

Table 3
                  X    Y    Z
N1                0    0    0
N9.5              4    4    4
4R4.5/14          3    0    0
4YR6.5/14         4    2    0
5Y8/13.5          4    -    0
4G5.5/10.5        1    1    -
5BG4.5/10         0    0    2
6PB3.5/13         1    0    4
6P3.5/13.5        2    0    3
6RP4/13.5         2    1    -

Using the encoded words for the X, Y and Z axes, the data processing device can effectively handle an encoded word combining X, Y and Z altogether by executing a simple calculation such as C = X + 5Y + 25Z. In this case, C represents a certain color in the XYZ color specification system.
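A minimal sketch of this combined code follows, assuming 5 codes (0 through 4) per axis; the pack/unpack pair and the worked example for the reference sample 4R4.5/14 (X = 3, Y = 0, Z = 0 in Table 3) are illustrative only.

```python
def pack_code(x, y, z):
    """Combine per-axis codes (0..4 each) into a single color code C = X + 5Y + 25Z."""
    return x + 5 * y + 25 * z

def unpack_code(c):
    """Recover the per-axis codes from a combined code C (0..124)."""
    return c % 5, (c // 5) % 5, c // 25

c = pack_code(3, 0, 0)        # reference sample 4R4.5/14 in Table 3
print(c, unpack_code(c))      # -> 3 (3, 0, 0)
```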

FIG. 3 denotes the flow of processes needed for preparing the table of threshold values which split the encoding sections so as to generate encoded words correctly corresponding to the reference samples. The input device first reads data related to the reference samples to obtain data containing multiple values of the RGB components. Then, based on these multiple-value data, the controller prepares the table needed for splitting the encoding sections using the section-division finalizing program.
FIG. 4 denotes the detailed algorithm of the section-division finalizing program shown in FIG. 3. First, when step 1 is entered, the input device reads data related to the reference samples. When step 2 is entered, a histogram is prepared based on the read-out data. When step 3 is entered, the controller computes the mean density value of the respective color areas. Next, when step 4 is entered, the controller identifies whether a white area is present or not by checking whether R = 255, G = 255 and B = 255. If the absence of the white area is identified, in other words, if it is identified that no border is present, the operation mode proceeds to step 5. When step 5 is entered, the individual data are inverted. This activates step 6, in which inversion flags are set, and then the operation mode returns to step 2 for preparing the histogram.
Conversely, if the controller identifies during step 4 that the white area is present, the operation mode proceeds to step 7, in which the maximum and minimum densities in the identical area are sought until the white area is eventually detected. When step 8 is entered, the controller then identifies whether the section division is done correctly or not by checking whether the difference between the maximum and minimum densities is less than the predetermined threshold value. If it is identified that the section division is not correctly done, step 9 is entered, in which the sections are modified, for example by setting up new sections based on the histogram prepared during step 2, and then the operation mode returns to step 7.
Conversely, if it is identified during step 8 that the section division has been done correctly, the operation mode proceeds to step 10, in which the section-dividing table specifying the threshold values needed for splitting the encoding sections is determined. Next, when step 11 is entered, the section-dividing table determined during step 10 is output to a file.
In other words, the controller computes the mean density value based on the histogram prepared in accordance with the read-out data and then executes the division of sections using white areas as border areas. The controller then identifies whether the section division using the white-area borders has been done correctly or not. If it is incorrectly done, a section modification is executed, and then, based on the eventual section division correctly done, the section-dividing table is determined before eventually being output to a file.
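The sketch below illustrates only the core of this idea: deriving per-component thresholds from measured reference samples and checking that each section's density spread stays below a tolerance. The white-border detection, inversion flags and histogram handling of FIG. 4 are omitted, and all names and values are hypothetical.

```python
def build_section_table(sample_values, num_codes=5, tolerance=30):
    """Derive threshold values that split a 0-255 density range into sections,
    one section per code, from measured reference-sample densities."""
    samples = sorted(sample_values)
    step = len(samples) / num_codes
    # Group the sorted samples into one cluster per code (a crude stand-in for
    # the histogram-based splitting of FIG. 4).
    clusters = [samples[int(i * step):int((i + 1) * step)] for i in range(num_codes)]
    # Place each threshold midway between neighbouring clusters (steps 7 and 10).
    thresholds = [(clusters[i][-1] + clusters[i + 1][0]) / 2 for i in range(num_codes - 1)]
    # Check, per section, that max - min density stays under the tolerance (step 8).
    for c in clusters:
        if c and max(c) - min(c) >= tolerance:
            raise ValueError("section too wide; sections would need modification (step 9)")
    return thresholds

# Hypothetical measured R-component densities of reference samples.
print(build_section_table([5, 12, 70, 78, 120, 131, 180, 188, 240, 250]))
```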
FIG. 5 is the flowchart denoting the flow of processes needed for preparing the table for decoding the encoded words.
The colors of the reference samples are measured by an XYZ color-measuring device to generate data in the XYZ color specification. The controller then prepares a codes-to-XYZ conversion table needed for converting the coded data into XYZ color specification data.
FIG. 6 is the flowchart denoting the flow of processes related to the preparation of the table needed for providing an optional output device with image signals suited to that output device. An adequate number of multiple-value data related to the RGB components are converted into signals suited to the output format before being supplied to the optional output device. The system then generates data in the XYZ color specification by applying XYZ color measurement to the output colors. The system then prepares an XYZ-to-RGB conversion table, needed for converting XYZ color specification data into multiple-value data related to the RGB components, in accordance with the adequate number of multiple-value RGB data and the corresponding XYZ color specification data.
The system controller thus preliminarily measures colors to detect which colors an output device outputs on receipt of given data. By securely holding the relationship between the data supplied to an output device and the colors output from that output device, the system controller allows colors to be reproduced correctly.
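A compact sketch of these two lookup tables follows; the nearest-neighbour matching in XYZ space, the output-device measurements and the table contents are assumptions used only to make the idea concrete (the code-to-XYZ entries reuse values from Tables 2 and 3).

```python
# Hypothetical measured data: combined code of a reference sample -> its XYZ (FIG. 5),
# and output-device input RGB -> the XYZ actually measured on its output (FIG. 6).
code_to_xyz = {3: (29.73, 17.94, 10.30), 101: (16.51, 12.34, 49.29)}
output_rgb_to_xyz = {(255, 0, 0): (28.9, 17.1, 9.8), (0, 0, 255): (17.0, 12.9, 48.5)}

def decode_to_rgb(code):
    """Code -> XYZ via the codes-to-XYZ table, then XYZ -> the output-device RGB
    whose measured color is closest, standing in for the XYZ-to-RGB table."""
    x, y, z = code_to_xyz[code]
    dist = lambda xyz: (xyz[0] - x) ** 2 + (xyz[1] - y) ** 2 + (xyz[2] - z) ** 2
    return min(output_rgb_to_xyz, key=lambda rgb: dist(output_rgb_to_xyz[rgb]))

print(decode_to_rgb(3))   # -> (255, 0, 0): the device signal that best reproduces code 3
```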
FIG. 7 is the flowchart denoting the flow of processes needed for encoding input images using the tables prepared by the processes shown in FIGS. 3 and 4. The input device reads the objective image to generate multiple-value data related to the RGB components, and finally encodes these multiple-value RGB data in accordance with the section-dividing table prepared by the algorithm shown in FIG. 4.
FIG. 8 is the flowchart denoting the flow of processes needed for generating an output image from the codes using the tables prepared by the processes shown in FIGS. 5 and 6. The controller first decodes the coded data into XYZ reference values in accordance with the codes-to-XYZ conversion table prepared by the processes shown in FIG. 5. The controller then converts the XYZ reference values into multiple-value data related to the RGB components in accordance with the XYZ-to-RGB conversion table prepared by the processes shown in FIG. 6. Then, the controller processes the converted RGB multiple-value data so that they are suited for application to the output device before eventually generating the output image.
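A per-element sketch of the FIG. 7 encoding step is given below; the threshold values and the image data are hypothetical stand-ins for a real section-dividing table, and the decoding of FIG. 8 would reverse this step through the conversion tables sketched earlier.

```python
def encode_pixel(rgb, thresholds):
    """FIG. 7 in miniature: quantize each RGB component with the section-dividing
    thresholds, yielding per-component codes 0..len(thresholds)."""
    return tuple(sum(v >= t for t in thresholds[ch]) for ch, v in enumerate(rgb))

# Hypothetical per-component thresholds (4 thresholds -> 5 codes per component).
thresholds = ([41, 99, 156, 214], [41, 99, 156, 214], [41, 99, 156, 214])

image = [[(250, 10, 12), (248, 14, 9)], [(10, 12, 245), (8, 11, 250)]]
coded = [[encode_pixel(px, thresholds) for px in row] for row in image]
print(coded)   # -> [[(4, 0, 0), (4, 0, 0)], [(0, 0, 4), (0, 0, 4)]]
```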


The output images eventually generated by the sequential processes executed by the color graphic image processing system related to the invention, including reading, encoding, and decoding of the color graphic image to be processed, proved to have been generated accurately, in perfectly identical colors, without causing unevenness in any of the areas denoting identical information.
In particular, the density values of fine characters and lines having extremely fine width cannot easily be stabilized, which would otherwise result in uneven colors and in difficulty in uniformly determining the colors of designated areas.
According to the performance test results, since the number of encoded words is limited as a result of generating a small number of encoded words, the color graphic image processing system related to the invention proved to easily provide specific encoded words and to uniformly determine the colors of the designated areas.
In addition, the performance test results proved that the image processing system provided a specific encoded word for a certain color, independent of differences in the input characteristics of input devices, by converting the image signals from the input devices into encoded words corresponding to the XYZ color specification.
Likewise, the image processing system related to the invention satisfactorily output specific colors for a certain encoded word, independent of differences in the output characteristics of output devices, by converting the data to be supplied to the output device into the form best suited for delivery to that output device.
Furthermore, the image processing system related to the invention generated perfectly shaped output images without even the slightest dullness along border areas, as a result of implementing the processes needed for sharpening image edges.
Now, a further explanation is given of the image-edge sharpening process. A certain image has specific numerical values related to the XYZ color specification, and the density values of this image are encoded by applying the encoding-section dividing table. As shown in FIG. 9, if the encoded words change, in the direction of the main scanning line, from an area containing a large-grade encoded word to an area containing a small-grade encoded word, then the image processing system executes a compensatory operation so that the encoded word corresponding to the middle grade is used as the encoded word of the small-grade area. Conversely, if the encoded words change from the area containing a small-grade encoded word to the area containing a large-grade encoded word, then the image processing system executes a compensatory operation so that the encoded word corresponding to the middle grade is used as the encoded word of the large-grade area; thus, effective sharpening of image edges is achieved. In addition, since data encoded with a relatively small number of encoded words has the specific characteristics described below, image edges can easily be sharpened.
(1) Since the portion at which the encoded word changes is taken as the edge, the image processing system can easily detect the presence of an image edge without applying any edge-detection operator.
(2) Since the encoded word of a picture element present in the narrow-width areas of fine characters and lines can be determined as a single encoded word without executing smoothing of the areas, the image processing system can easily determine the compensated encoded words of the picture elements near the image edges.
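Returning to the middle-grade compensation described above, the sketch below applies it along one scanning line; the choice of integer midpoint and the sample codes are assumptions for illustration only.

```python
def sharpen_line(codes):
    """Where neighbouring encoded words differ along the main scanning line,
    give the element just past the transition the middle-grade code, as
    described for FIG. 9 (per-axis codes 0..4; integer midpoint assumed)."""
    out = list(codes)
    for i in range(len(codes) - 1):
        a, b = codes[i], codes[i + 1]
        if a != b:
            out[i + 1] = (a + b) // 2   # large->small lifts the small side, small->large lowers the large side
    return out

print(sharpen_line([4, 4, 4, 0, 0, 0]))  # -> [4, 4, 4, 2, 0, 0]
print(sharpen_line([0, 0, 4, 4]))        # -> [0, 0, 2, 4]
```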
When extracting an optional area, since data encoded with a small number of encoded words contains only a small number of distinct encoded words, it is possible for the system to apply a specific encoded word to a portion where the image color visually appears uniform. Note that, even with the encoding operation mentioned above, the system cannot apply a single specific encoded word to an area containing density values very close to the encoding-section threshold value of a certain encoded word. As a result, basically, a specific process for applying specific encoded words to such an area should be carried out by a processor. Actually, however, after applying the above processes to a number of images, the inventors confirmed that there was no substantial need to implement such processes with a specific processor; consequently, the inventors conclude that no substantial disadvantage is actually incurred. As a result, the image processing system can correctly extract an optional area from an image encoded by the processes mentioned above by providing the data-processor 3 with a specific encoded word, such as the above-cited "C", as a parameter, formed from the combination of the encoded words on the X, Y and Z axes of the area to be extracted. The system mentioned above correctly extracts an optional area by applying integer values of 0 through 124 for each color when using the parameter "C" mentioned above. For example, the system can extract only the "red" area by applying the parameter "100", corresponding to the "red" area, to the data-processor 3.
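A minimal sketch of this extraction step follows; the coded image, the target parameter value and the binary-mask representation are assumptions, not the patent's data format.

```python
def extract_area(coded_image, target_c):
    """Return a binary mask marking picture elements whose combined code C
    (C = X + 5Y + 25Z, 0..124) equals the designated parameter."""
    return [[1 if (x + 5 * y + 25 * z) == target_c else 0 for (x, y, z) in row]
            for row in coded_image]

# Hypothetical coded image: per-element (X, Y, Z) codes.
coded = [[(3, 0, 0), (3, 0, 0), (0, 0, 4)],
         [(3, 0, 0), (0, 0, 4), (0, 0, 4)]]
print(extract_area(coded, 3))   # extracts the area whose combined code is 3
```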
Table 4 represents the results of a comparison of compression rates among four cases: the original image, a run-length encoding process, an encoding process using 5 encoded data per color, and the combination of the encoding process using 5 encoded data per color and the run-length encoding process, respectively. Thus, Table 4 indicates that the compression rate of a color graphic image can be significantly improved by applying the color graphic image processing system related to the present invention.

Table 4
                                                Amount of data       Compression
Images                                          (bytes)              rate
Original image                                  (1,000 x 875) x 3       -
Run-length encoding process                     2,718,912               0.97
Encoding process using 5 encoded
data per color                                    437,500               6
Combination of encoding process using
5 encoded data per color and
run-length encoding process                        72,366              36.3

As is clear from the foregoing description, the color graphic image processing system related to the present invention easily determines the single applicable encoded word suited to a certain area. The system also easily absorbs the differences in characteristics between the individual input and output devices in operation.
The system easily provides specific encoded words without splitting fine characters and lines. In addition, the system effectively constrains the adverse effects caused by background soiling, and yet it easily extracts the desired area and sharpens image edges, thus easily and thoroughly eliminating dullness from the borders of the processed image.
Furthermore, the system significantly improves the data-compression rate by combining a conventional encoding art such as the run-length encoding system.
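As an illustration of how run-length coding compounds with the small code alphabet (see claim 6 and Table 4), the sketch below run-length encodes a line of combined codes; the (code, run length) pair representation and the sample scanline are assumptions, not the patent's byte format.

```python
def run_length_encode(codes):
    """Collapse runs of identical combined codes into (code, run length) pairs.
    This works well here because coded color-graph lines contain long uniform runs."""
    runs = []
    for c in codes:
        if runs and runs[-1][0] == c:
            runs[-1][1] += 1
        else:
            runs.append([c, 1])
    return [tuple(r) for r in runs]

line = [3] * 400 + [100] * 500 + [3] * 100   # hypothetical coded scanline, 1,000 elements
print(run_length_encode(line))               # -> [(3, 400), (100, 500), (3, 100)]
```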
The above description merely refers to one of the preferred embodiments, in which the encoding operation is executed by applying five encoded words per color. However, another preferred embodiment allows the use of 4 or 6 encoded words per color. More particularly, when applying 5 encoded words to each of the RGB components of the three primary colors, 5^3 (five to the third power) = 125 colors can be represented. These 125 colors covering the three primary colors can fully be produced by applying 7-bit data per picture element. Thus, when using any conventional 8-bit computer, the remaining 1 bit of data capacity can be used as a parity bit. If 4 encoded words per color were applied, the needed colors could be represented by applying 6-bit data per picture element. Likewise, if 6 encoded words per color were applied, the needed colors could be represented by applying 8-bit data per picture element. Although the provision of 5 encoded words is most desirable, if it is allowable to introduce such particular waste as mentioned above, it is also possible for the system to set 4 or 6 encoded words. Furthermore, it is also possible for the system to modify the constitution in various ways without departing from the essential spirit and scope of the present invention defined in the following claims.
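The bit counts quoted above follow directly from the size of the code alphabet; a quick check, with the per-element packing left as an assumption:

```python
import math

for words_per_color in (4, 5, 6):
    colors = words_per_color ** 3          # combined codes per picture element
    bits = math.ceil(math.log2(colors))    # bits needed to hold one picture element
    print(words_per_color, colors, bits)   # -> 4 64 6 / 5 125 7 / 6 216 8
```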

Claims (6)

1. A color graphic image processing method comprising the steps of:
reading a color graphic image by utilization of a color image input device with the color image input device transforming that which is read into image data, coding the image data in accordance with section dividing means which acts to compress the image data by dividing a density region of each color in the color graphic image into sections represented by 4 through 6 codes with the codes being based upon data obtained by measuring a plurality of reference color samples with said color image input device, converting the coded data to a color image output signal in accordance with conversion means in which a code corresponding to a reference color sample is related to a color image output signal which is determined so as to cause a color image output device to output a color substantially the same as said reference color sample, providing the color image output signal to the color image output device to reproduce a color graphic image.
2. A color graphic image processing method according to claim (1), wherein said section dividing means includes stored information which is prepared by making available a plurality of reference color samples, assigning 4 to 6 codes with respect to the color densities of each reference color sample, measuring the color densities of the reference samples with the color image input device and dividing a density region of each color into sections corresponding with said codes by relating measured densities of reference color samples to said codes.
3. A color graphic image processing method according to claim (1), wherein said conversion means includes stored information which is prepared by making available the plurality of reference samples, coding every reference color sample, inputting a number of color image output signals to the color image output device, deciding a color among the outputted colors which is substantially the same as the reference color sample, and relating the code of the reference color sample to the color image output signal of the decided color.
4. A color graphic image processing method according to claim (2), wherein a total of 5 codes are assigned with respect to the color densities.
5. A color graphic image processing method according to claim (3), wherein a total of 5 codes are assigned with respect to the color densities.
6. A color graphic image processing method according to claim (1), wherein the coded data obtained in accordance with the section dividing means is further compressed by subjecting the coded data to a run-length encoding process.
CA000530066A 1986-02-20 1987-02-19 Color graphic image processing method Expired - Fee Related CA1300735C (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP36041/1986 1986-02-20
JP61036041A JPS62193457A (en) 1986-02-20 1986-02-20 Color graph picture processing system

Publications (1)

Publication Number Publication Date
CA1300735C true CA1300735C (en) 1992-05-12

Family

ID=12458623

Family Applications (1)

Application Number Title Priority Date Filing Date
CA000530066A Expired - Fee Related CA1300735C (en) 1986-02-20 1987-02-19 Color graphic image processing method

Country Status (4)

Country Link
US (1) US4853767A (en)
EP (1) EP0234530A3 (en)
JP (1) JPS62193457A (en)
CA (1) CA1300735C (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0292284B1 (en) * 1987-05-21 1994-01-05 Sony Corporation Method and apparatus for processing display colour signal
DE3885945T2 (en) * 1987-06-24 1994-06-30 Canon Kk Color printer.
EP0309884A3 (en) * 1987-09-28 1991-04-10 Mitsubishi Denki Kabushiki Kaisha Color image display apparatus
JPH0241271A (en) * 1988-08-01 1990-02-09 Fuji Xerox Co Ltd Color image forming apparatus
DE68927970T2 (en) * 1988-09-08 1997-10-09 Canon Kk Point image data output device
JP3113659B2 (en) * 1988-09-08 2000-12-04 キヤノン株式会社 Image forming device
JPH03268563A (en) * 1990-03-16 1991-11-29 Mita Ind Co Ltd Laser beam controller
ES2118075T3 (en) * 1990-03-27 1998-09-16 Canon Kk APPARATUS OF COMMUNICATION OF IMAGES IN COLOR AND CORRESPONDING METHOD.
US5245446A (en) * 1990-10-10 1993-09-14 Fuji Xerox Co., Ltd. Image processing system
DE69217319T2 (en) * 1991-10-07 1997-07-17 Xerox Corp Image editing system and method with improved color palette editing
US5398297A (en) * 1992-07-27 1995-03-14 Tektronix, Inc. Improved color rendering method for intensity-variable printing based upon a dithering table
JPH07147639A (en) * 1993-11-22 1995-06-06 Canon Inc Device and system for forming image
US5831559A (en) * 1996-01-24 1998-11-03 Intel Corporation Encoding/decoding video signals using multiple run-val mapping tables
JPH11161782A (en) * 1997-11-27 1999-06-18 Seiko Epson Corp Method and device for encoding color picture, and method and device for decoding color picture
NL1013669C2 (en) * 1999-11-25 2001-05-28 Ocu Technologies B V Method and device for color quantization.
WO2004034323A2 (en) * 2002-10-07 2004-04-22 Summus, Inc. System for graphics compression and display
US7877827B2 (en) 2007-09-10 2011-02-01 Amerigon Incorporated Operational control schemes for ventilated seat or bed assemblies
US9125497B2 (en) 2007-10-15 2015-09-08 Gentherm Incorporated Climate controlled bed assembly with intermediate layer
CN102098947B (en) 2008-07-18 2014-12-10 阿美里根公司 Climate controlled bed assembly
US8332975B2 (en) 2009-08-31 2012-12-18 Gentherm Incorporated Climate-controlled topper member for medical beds
US10174485B2 (en) * 2016-11-23 2019-01-08 Cnh Industrial America Llc System and method for providing reconfigurable input devices for a work vehicle

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS56102853A * 1980-01-21 1981-08-17 Dainippon Screen Mfg Co Ltd Method for pretreating image signal of image copying apparatus
US4342047A (en) * 1980-04-30 1982-07-27 United Technologies Corporation Color calibration accuracy in a color raster scanner
US4307415A * 1980-04-30 1981-12-22 United Technologies Corporation Color identification circuit
US4302770A (en) * 1980-04-30 1981-11-24 United Technologies Corporation Color raster scanner for recording cartographic data
US4301469A (en) * 1980-04-30 1981-11-17 United Technologies Corporation Run length encoder for color raster scanner
JPS5957573A (en) * 1982-09-27 1984-04-03 Canon Inc Picture processor
JPS59135975A (en) * 1983-01-26 1984-08-04 Nippon Telegr & Teleph Corp <Ntt> Coding and decoding system of color signal
JPS59161982A (en) * 1983-03-06 1984-09-12 Canon Inc Picture processor
US4677649A (en) * 1983-04-26 1987-06-30 Canon Kabushiki Kaisha Data receiving apparatus
US4646134A (en) * 1984-03-21 1987-02-24 Sony Corporation Apparatus for encoding image signal
US4682215A (en) * 1984-05-28 1987-07-21 Ricoh Company, Ltd. Coding system for image processing apparatus
US4684923A (en) * 1984-09-17 1987-08-04 Nec Corporation Encoder with selective indication of compression encoding and decoder therefor
US4679094A (en) * 1986-10-14 1987-07-07 The Associated Press Method for compression and transmission of video information

Also Published As

Publication number Publication date
EP0234530A3 (en) 1990-03-21
US4853767A (en) 1989-08-01
JPS62193457A (en) 1987-08-25
EP0234530A2 (en) 1987-09-02

Similar Documents

Publication Publication Date Title
CA1300735C (en) Color graphic image processing method
EP0187911B1 (en) Method and apparatus for thresholding image data
EP0110353B2 (en) Picture signal processing system suitable for displaying continuous tone pictures
KR950009697B1 (en) Area dis criminating systems for an image processing system
US6307962B1 (en) Document data compression system which automatically segments documents and generates compressed smart documents therefrom
US4124870A (en) Method for improving print quality of coarse-scan/fine-print character reproduction
US6934419B2 (en) Enhanced compression of gray-level images
US5218649A (en) Image enhancement system
US5072291A (en) Image information signal processing apparatus for improving reproduced image quality by discriminating the type of input image and selecting a particular processing in accordance therewith
US7466455B2 (en) Image processing method and system for performing monochrome/color judgement of a pixelised image
US4837846A (en) Method of the image processing
RU2126598C1 (en) Method and device for adaptive rendering of half-tone images
US6256421B1 (en) Method and apparatus for simulating JPEG compression
JPH01112377A (en) Picture information processor
JP2760791B2 (en) Image information processing device
JPH08298589A (en) Image processor of photograph/document mixed image
JPH04236574A (en) Picture coding system
JP2694255B2 (en) Facsimile machine
JPH02123488A (en) Optical character reader
EP1432236B1 (en) Image processing of pixelised images
JPH02206970A (en) Method and apparatus for converting and coding and decoding picture signal
JP2561294B2 (en) Image coding device
JPH0575872A (en) Picture processor
KR930003482B1 (en) Simple identification method of mixed video and text documents
JPH04188952A (en) Color picture communication system

Legal Events

Date Code Title Description
MKLA Lapsed