WO1998047108A1 - Image composition method and apparatus - Google Patents
- Publication number
- WO1998047108A1 (PCT/US1997/006316)
- Authority
- WO
- WIPO (PCT)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
Definitions
- the present invention relates generally to computerized imaging methods and apparatus, and more particularly to a novel imaging apparatus and method in which objects are composed of pixel data in which the transformation of virtual light interacting with them is specified by six explicit transformation components and three implicit transformation components.
- Painter's algorithm is the prevalent prior art in computer image composition.
- This algorithm uses a bottom-to-top "king of the mountain" approach to combine multiple overlaid opaque images. It first lays down the bottommost image (Object B), then writes over (replaces) the portions of that image that are overlaid by the image in the layer above it (Object A), and continues in this fashion until all layers have been composed (Fig. 1).
- In Fig. 1, what is seen on an imaging display are the colors of the topmost objects on the screen.
- Two pixels (picture elements) from Object A overlap the pixels directly below them in Object B. Since Object B's pixels are drawn to the composite image first, its two pixels that underlie Object A are overwritten as Object A is added to the final image.
- Object A Alpha defines the percentage of Object A that is combined or blended with the objects below it to yield the composite.
- The alpha values for the two pixels of Object A that overlay Object B are .40; that is, 40% of each Object A pixel's color is combined with 60% (1 − .4) of the Object B pixel below it.
- An object in the layer directly above Object A is blended next with this A-over-B composite, based on the alpha layer associated with that object.
- Each layer above is then blended in turn (thus bottom-up composition) to yield the final composite image. Painter's algorithm with alpha blending is counterintuitive in the way it blends colors, because the metaphor it emulates is that of a painter mixing and applying paints to a canvas.
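As a concrete illustration of the prior-art blend just described, here is a minimal per-pixel sketch. It assumes normalized color tuples in [0, 1]; the function name is ours, not the patent's.

```python
def alpha_blend(bottom, top, alpha):
    """Blend a top pixel over a bottom pixel, per color channel (Fig. 2 style)."""
    return tuple(alpha * t + (1.0 - alpha) * b for t, b in zip(top, bottom))

# 40% of an Object A pixel combined with 60% of the Object B pixel below it:
composite = alpha_blend((0.2, 0.4, 0.6), (1.0, 0.0, 0.0), 0.40)  # ≈ (0.52, 0.24, 0.36)
```

Composing a full image repeats this blend bottom-up for every layer, which is exactly the behavior the ARTOIM replaces.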
- the present invention is directed to methods and apparatus for implementing a new imaging and image composition system referred to as the ARTOIM (Absorption, Reflection, Transmission Image Composition Model and algorithms).
- the ARTOIM system is based on the real world metaphor of surface absorption, reflection, and transmission where many layers of translucent and opaque objects are composed into a final image for output to a video display or for printing.
- the algorithms are a substitution for both "Painter's algorithm” (Fig. 1), the image composition technique traditionally used in computer graphic systems, and the alpha blending algorithms (Fig. 2) used in the computer and television industries for video special effects such as cross fades between images, transparency, and color keying.
- this system is based on a light propagation metaphor that uses virtual light source illumination and the absorption, reflection, and transmission properties of the objects in the image to create the final screen image.
- a chosen color model e.g., RGB, YUV, Lab, HSV, YIQ, HLS, CMY, CMYK, etc.
- The pixels in the subject system are represented by 6 explicit transformation components (r₁, r₂, r₃, t₁, t₂, t₃) and 3 implicit transformation components (a₁, a₂, a₃), wherein a, r and t refer to absorption, reflection and transmission, and the subscripts 1, 2, and 3 refer to the components of the chosen color model.
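A minimal sketch of this pixel representation (the class name and tuple layout are our assumptions; the patent does not prescribe a data structure), showing how the three implicit absorption components follow from r and t:

```python
from dataclasses import dataclass

@dataclass
class ARTPixel:
    """Six explicit components per pixel: reflection and transmission
    triples over the chosen color model (e.g. RGB)."""
    r: tuple  # (r1, r2, r3) reflection per color component
    t: tuple  # (t1, t2, t3) transmission per color component

    def absorption(self):
        """The three implicit components: light neither reflected nor
        transmitted is absorbed, so a = 1 - r - t per component."""
        return tuple(1.0 - ri - ti for ri, ti in zip(self.r, self.t))

p = ARTPixel(r=(0.3, 0.2, 0.1), t=(0.5, 0.5, 0.5))  # absorption ≈ (0.2, 0.3, 0.4)
```

Keeping absorption implicit halves the stored components while losing no information, since the three triples always sum to one per channel.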
- In accordance with the ARTOIM system, and as depicted in Fig. 3, light propagates from a distant source to the topmost object layer (viewer side) in a scene, where some light is absorbed, some reflected and some transmitted. The light that is transmitted continues on to the next object layer where the process is repeated.
- the ARTOIM system algorithms calculate the total light reflected back to the viewer from all object layers. It includes the contributions from all of the modes of internal reflection between the layers which eventually reemerge and combine with the light directly reflected back to the viewer.
- the ARTOIM system algorithms calculate the total light transmitted through all of the layers of objects (including all of the modes of internally reflected light which reemerge below the last layer going away from the viewer) and implicitly all of the light absorbed by all of the layers.
- Fig. 1 is a diagram depicting the prior art Painter's Algorithm method of processing image pixel data
- Fig. 2 is a diagram depicting the prior art method of processing image pixel data using bottom-up alpha blending
- Fig. 3 is a diagram illustrating the basic concepts of the present invention
- Fig. 4 illustrates a simplified example of pixel processing in accordance with the present invention
- Figs. 5a-13 are ray-tracing diagrams used to illustrate the present invention
- Figs. 14 and 15 are diagrams illustrating methods in accordance with the present invention
- Fig. 16 is a functional block diagram illustrating an imaging system in accordance with the present invention
- Figs. 17-20 illustrate arithmetic logic unit schematics in accordance with the present invention
- Fig. 21 is a logic flow chart illustrating operation of the preferred embodiment
- Fig. 22 is a diagram illustrating image composition in accordance with the present invention
- Fig. 23 is a diagram schematically comparing various embodiments of the present invention with prior art imaging methods.
- The Light Sources: One or more light sources can be positioned above, below, or between the layers of objects in the scene to be composed. Light sources can be defined as any color allowed in the selected color space. Light propagates from each source to the layers of objects in the scene and is absorbed, reflected, and transmitted through all of the layers of objects in the image.
- The Graphic Objects: Light is reflected only upward at an object's top surface and downward at its bottom surface (no internal reflection within an object). The absorption, reflection, and transmission properties of each object are uniform, independent of the direction of light propagation.
- All modes of internal reflection between any number of layers are included in the calculation of the total amount of light that is reflected back to a viewer, that is absorbed, and that is transmitted through all of the objects in an image.
- the following simplifications are made to the assumptions above.
- Light Source: A single virtual light source is assumed to be positioned at an infinite distance behind the viewer observing the scene on the image display. Illumination from the light source is uniform across all of the objects in the scene and propagates from the topmost (viewer) side of the objects to the bottom side.
- Graphic Object Properties: Object surfaces are oriented orthogonal to the incident plane-wave light from the light source. Reflection and transmission of the virtual light by objects is a plane-wave, not a specular, process. Absorption of light by objects is not evaluated directly (thus it is implicit), so whatever percentage of light is not reflected or transmitted by an object layer is lost from the process.
- The Viewer: The viewer of the composed image is positioned between the distant light source and the topmost object in the scene.
- A′bc(d)
- The main term "A" is a transformation component which can take the following values: r: reflection of a single layer; t: transmission of a single layer; R: reflection of a composite of multiple layers; T: transmission of a composite of multiple layers; L: incident plane-wave light from a light source (L↓ from above and L↑ from below); C: final image color that the viewer sees.
- the prime (') refers to a resulting value after an iterative computation.
- The subscript "b" indicates the surface (top or bottom) at which reflections occur. Possible values are: ↑: reflecting from the top surface; ↓: reflecting from the bottom surface.
- the subscript "c" which may or may not be used, indicates the color component associated with the main term.
- If the RGB color model is used, the possible values are: R: red component; G: green component; B: blue component.
- (d) is an index that identifies the layer that the expression refers to. This index takes the values: i: the current layer being processed, which lies immediately beneath layer i−1; i−1: the layer previously processed.
- the light transformation properties of a graphic object are completely described by a three component reflection vector r and a three component transmittance vector t. Since the ARTOIM properties of objects in the preferred embodiment are uniform independent of whether the light is coming from below or above, the reflection and transmission for a single object is equivalent for light propagating in both directions.
- Two Layer Model: A two-layer image in the preferred embodiment (Fig. 6) is composed of object(s) in layer 1 on top and layer 2 on the bottom, and receives top incident light L↓ from a light source above and bottom incident light L↑ from a light source below. Some of L↓ is reflected from the top of layer 1 and some is transmitted through to layer 2, where again some is reflected and some transmitted. Some of the L↓ that is transmitted through layer 1 is reflected back and forth between the two layers, finally transmitting back through either layer 1 to contribute to the total reflection (R↑), or through layer 2 to contribute to the total transmission (T).
- R↑: total reflection
- T: total transmission
- L↓R↑ represents the sum of the light from the light source that is reflected from the top of layer 1, and the light that is internally reflected (1, 3, 5, 7, ... times) between the layers before being transmitted back upward through layer 1.
- L↑R↓ represents the converse of this for light that comes from a light source situated below the layers, where R↓ is the total reflection for light that comes from below.
- L↓T and L↑T represent the sum of the light directly transmitted through both of the layers, and the light that is internally reflected between the layers before finally being transmitted through the second layer in its initial direction.
- C represents the total light that a viewer sees from the interaction of the incident light with the two layers. Equation 0 expresses C for the case of two light sources (one on each side of the layers), Equation 0a expresses C for the case of a single light source from above, and Equation 0b expresses C for the case of a single light source from below.
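Equations 0, 0a and 0b do not survive in this text. A reconstruction from the surrounding definitions (an interpretation, not the patent's verbatim equations), with L↓ the light incident from above, L↑ the light incident from below, R↑ and R↓ the total reflections from the top and bottom, and T the total transmission, per color component:

```latex
\begin{aligned}
C &= L_{\downarrow} R_{\uparrow} + L_{\uparrow} T && \text{(Eq.\ 0: light sources above and below)}\\
C &= L_{\downarrow} R_{\uparrow}                  && \text{(Eq.\ 0a: single source above)}\\
C &= L_{\uparrow} T                               && \text{(Eq.\ 0b: single source below)}
\end{aligned}
```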
- the Two Layer Model can be described mathematically by the following equations.
- the total reflection of light from the top for each color component can be expressed as:
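The equation referenced here is lost to the extraction. Summing the geometric series of internal-reflection modes described in the text yields the following reconstruction (an interpretation, not the verbatim equation; per color component, with r(k) and t(k) the reflection and transmission of layer k):

```latex
\begin{aligned}
R_{\uparrow} &= r(1) + t(1)^2\, r(2) \sum_{n=0}^{\infty} \bigl[r(1)\,r(2)\bigr]^n
             = r(1) + \frac{t(1)^2\, r(2)}{1 - r(1)\, r(2)}\\[4pt]
R_{\downarrow} &= r(2) + \frac{t(2)^2\, r(1)}{1 - r(1)\, r(2)}\\[4pt]
T &= \frac{t(1)\, t(2)}{1 - r(1)\, r(2)}
\end{aligned}
```

Note that R↑ contains no t(2) term and R↓ contains no t(1) term, consistent with the asymmetry the text discusses for the combined reflections.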
- While the reflection of a single layer is always equal (uniform) regardless of the direction of propagation of the light (upward or downward), this is not true for the combined reflections (R↑ & R↓) of the two layers.
- R↑ does not include the transmission term t(2) and absorption term a(2) of layer 2
- R↓ does not include the transmission term t(1) and the absorption term a(1) of layer 1.
- The computation of each light path that contributes to the R values involves both an even number of transmission terms (0 or 2) and an odd number of reflection terms (1, 3, 5, ...). This results in an asymmetric reflection of the light propagating in either direction.
- Fig. 7 illustrates a generalization of the two-layer model described above. Here, two arbitrary layers, layer i−1 (top layer) and layer i (bottom layer), receive incident light from the top and the bottom. In the ARTOIM, the calculation of R↑, R↓, and T for two layers completely describes those layers as if they were a single layer.
- Multi-Layer Model: A multi-layer image (Fig. 9) is formed by the combination of a virtual layer i−1 (itself composed of multiple layers) and a new layer i.
- The reflection and transmission properties (R↑(i−1), R↓(i−1) & T(i−1)) of the virtual layer have been previously calculated.
- The total reflection and transmission (R′↑, R′↓ & T′) are calculated here as if the virtual layer and the new layer were the two layers in the two-layer model above.
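Equations 4a, 5a and 6a are not reproduced in this extraction. Substituting the virtual layer i−1 for layer 1 and the new layer i for layer 2 in the two-layer closed forms gives this reconstruction (again an interpretation, not the verbatim equations):

```latex
\begin{aligned}
R'_{\uparrow} &= R_{\uparrow}(i{-}1) + \frac{T(i{-}1)^2\, r(i)}{1 - R_{\downarrow}(i{-}1)\, r(i)}\\[4pt]
R'_{\downarrow} &= r(i) + \frac{t(i)^2\, R_{\downarrow}(i{-}1)}{1 - R_{\downarrow}(i{-}1)\, r(i)}\\[4pt]
T' &= \frac{T(i{-}1)\, t(i)}{1 - R_{\downarrow}(i{-}1)\, r(i)}
\end{aligned}
```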
- Equations 4a, 5a, 6a are exact solutions requiring a minimum number of computations, as each layer's contribution is computed consistent with the Multi-Layer Model described above. Equations 4, 5 and 6 express the sum of the reflected and transmitted light over all modes of internal reflection and transmission between all of the layers in an image.
- for n = 0 to infinity, with 0 ≤ r, t ≤ 1
- The modes of internal reflection for transmitted light are represented graphically in Fig. 11.
- The implementation of the algorithms for the ARTOIM consists of iteratively evaluating Equations 4a, 5a and 6a on the object layers from the top down, as the contribution of each layer is accumulated in the terms R′↑, R′↓ and T′.
- The final image color (C_RGB) that the viewer sees for a single light source from above is given by Equation 7.
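The top-down accumulation described here can be sketched as follows, assuming Equations 4a-6a take the geometric-series closed form implied by the two-layer model; all names are ours, and channels are plain tuples rather than any particular image format:

```python
def combine(acc, layer):
    """Fold one single-layer object into the accumulated virtual layer.

    acc   = (R_up, R_dn, T): per-channel totals for the layers above
    layer = (r, t):          per-channel components of the new layer i
    """
    R_up, R_dn, T = acc
    new_up, new_dn, new_T = [], [], []
    for Ru, Rd, Tv, ri, ti in zip(R_up, R_dn, T, *layer):
        denom = 1.0 - Rd * ri          # 1/denom sums all internal-reflection modes
        new_up.append(Ru + Tv * Tv * ri / denom)
        new_dn.append(ri + ti * ti * Rd / denom)
        new_T.append(Tv * ti / denom)
    return new_up, new_dn, new_T

def compose(layers, light):
    """Top-down composition; final color C = L * R_up (single source above)."""
    r0, t0 = layers[0]
    acc = (list(r0), list(r0), list(t0))   # a single layer: R_up = R_dn = r, T = t
    for layer in layers[1:]:
        acc = combine(acc, layer)
    return [L * Ru for L, Ru in zip(light, acc[0])]

# A 50%-transmissive layer over an opaque layer, single channel, white light:
color = compose([((0.2,), (0.5,)), ((0.5,), (0.0,))], (1.0,))  # ≈ [0.3389]
```

An early exit when the accumulated transparency T falls below a threshold (the opacity cutoff described later for the hardware embodiment) could be added inside the loop without changing the recurrence.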
- a virtual layer can be combined with another virtual layer (Fig. 12) in the same manner as described above in the Multi-Layer Model.
- A virtual object (itself composed of real objects), with transformation components R↑, R↓ & T, can be added instead of combining a single-layer (real) object (with transformation components r & t) with the accumulated virtual layer above.
- complex images can be composed by iteratively adding or blending a combination of real (single layer) and virtual (multi-layer) ART graphic objects.
- The pixel information for real objects is stored and manipulated in the ART Real Image Data Format, consisting of the simple transformation components: r_R, r_G, r_B, t_R, t_G, t_B.
- The pixel information for complex multi-layer virtual objects is stored and manipulated in the ART Virtual Layer Image Data Format, consisting of the complex transformation components: R↑_R, R↑_G, R↑_B, R↓_R, R↓_G, R↓_B, T_R, T_G, T_B.
- the ART models presented here can be simplified to reduce the number of computations for imaging systems (e.g. video games, multimedia systems) that cannot afford the cost of computing the exact solutions of the Multi-Layer Model.
- the first simplification that can be made to the ART Imaging Model is to allow reflection, but no transmission (Fig. 13 - Opaque ART implementation).
- Only the reflection transformation components (r_R, r_G, r_B) are used to describe how light interacts with objects.
- The computational cost of combining each pixel by the simplified ART methods may be slightly greater than that of the Painter's and alpha-blend approaches.
- the ARTOIM algorithms can be applied from the top down until either an opaque object or layer is reached, or the opacity of a layer reaches some predefined threshold value. This can significantly reduce the total number of computations in image composition with the ARTOIM.
- the composition of complex multi-layered images (as in animation and multimedia) will require fewer overall computations.
- the ART Image Model yields an attendant increase in imaging quality and a natural metaphor for imaging that is easier for the artist to apply and will easily extend to 3D imaging.
- Because ART images are defined in an entirely new format that is independent of light source color and can include measured information about the transparency of image objects, a special digital photography process is required to capture images of real objects in the ART Image Data Format.
- As shown in Fig. 14, in place of the traditional process of taking a single photograph of an object to create an image data file, this embodiment of the ART image capture process uses two photographs of each object. One photograph is taken with the object posed in front of a non-reflecting black background, while another is taken with it posed in front of a totally reflecting white background. The white background is composed of a highly reflective surface like Scotchlite™, which is widely used in special-effects photography.
- the black background is a highly absorbing surface, and a material like black velvet is typically used.
- the color data is then input into the computer from these photographs in a standard color data format (like "RGB"), and then converted to a light source independent format by normalizing each component with the separately measured light source color for that component.
- the light source independent image data for the two object photographs is then clipped to remove the portions of the backgrounds outside of the bounding rectangle for the objects, and then these photographs are combined into a single image file in the ART Image Data Format per the ART algorithms and methods.
- object images stored in the ART Image Data Format can then be combined into a composite image by implementing the ART algorithms.
- the light source independent ART components are multiplied by the color components of a selected viewing light source to convert them into displayable color components.
- The process converts RGB image data from two photographs of an object (against white and black backgrounds) into the ART Image Data Format.
- The Two Layer Model algorithms described above will now be used to define the conversion of two photographs of an object, scanned in RGB data format, to the ART Image Data Format.
- the data files (in RGB format) for an object that has been photographed against black and white backgrounds are represented in pictorial form in Fig. 15. In the photo on the left the object has been photographed against a black background.
- The measured light (scanned image data of the object in photo 1) consists only of the light that is directly reflected by the object.
- Photographing the object against a white background ensures that all light transmitted through the object is reflected back upward through the object.
- the measured light in this case consists of both the light that is directly reflected by the object, and the light that is transmitted through it, reflected by the background and re-transmitted through it again.
- The unknown ART transformation components for the object in photo 1 (black background) are r(1) and t(1).
- the reflection transformation components r(l) of the object are just equal to the light source normalized RGB value of the object photographed or scanned against a black background, and the transmission transformation components of the object t(l) are defined by equation 11.
- This example does not deal with the bias that results from using non-ideal black and white backgrounds that do not absorb and reflect (respectively) 100% of the light that strikes them. If more precise results are required, two additional images (of the black and white backgrounds alone) can be captured and used to yield a correction factor to compensate for this bias error.
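The capture conversion can be sketched as follows for ideal backgrounds. The closed form for t is our reconstruction of Equation 11 from the two-layer model with a 100%-reflective backing; the function name and tuple interface are ours.

```python
import math

def rgb_to_art(black_px, white_px):
    """Derive (r, t) per channel from light-source-normalized pixel values
    measured against ideal black and white backgrounds.

    Against black only direct reflection is seen:  b = r.
    Against ideal white (100% reflective) the transmitted light returns:
        w = r + t^2 / (1 - r)
    so  t = sqrt((w - b) * (1 - b)).
    """
    r, t = [], []
    for b, w in zip(black_px, white_px):
        r.append(b)
        # Clamp against small negative values from measurement noise.
        t.append(math.sqrt(max(0.0, (w - b) * (1.0 - b))))
    return tuple(r), tuple(t)

# A channel reflecting 20% and transmitting 50%:
r, t = rgb_to_art((0.2,), (0.5125,))  # r = (0.2,), t ≈ (0.5,)
```

With non-ideal backgrounds, the two extra background-only captures described above would supply per-channel correction factors before this conversion is applied.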
- ART IMAGING SYSTEM: There are many different ways that a system based on the ART Imaging Model and algorithms could be implemented, depending on the application and the image composition speed required.
- The method of implementing an ART system could comprise only hardware, a combination of hardware and firmware, a combination of hardware, firmware and software, or even be implemented entirely in software.
- It is defined as a process that sequentially operates on a single pixel from each object (or layer) in vertical alignment (see Fig. 4).
- this system starts with the topmost object pixel and combines it with the next one immediately beneath it, then proceeds to combine it with the pixel in the next layer below that, and so on, until all of the required pixels have been combined into a final display pixel.
- FIG. 16 A functional block diagram of a system to implement the ART algorithms is shown in Fig. 16. This system would operate in the following manner.
- The Timing Controller (10) initiates the image composition process by directing (over connection "k") the Pixel Address Processor (block 12) to identify the next new pixel of the topmost object to be added to the image.
- the Pixel Address Processor determines the location or address at which this new pixel is stored in the Graphic Object Memory (16) and passes this address to the Pixel Access Controller (14) over connection "a".
- The Timing Controller (10) then causes the Pixel Access Controller to read over connection "b" the transformation components for this new pixel from the Graphic Object Memory (16), and the extended transformation components (R↑_RGB, R↓_RGB, T_RGB) for the previously composed or accumulated pixels (combined objects above) from the Blend Accumulation Register (18).
- The format of the transformation components passed over connection "b" is that of a single-layer object, or r_RGB, t_RGB (as shown for the single layer being combined with the virtual layer in Fig. 9).
- The format of the components read over connection "b" is R↑_RGB, R↓_RGB, T_RGB.
- the Timing Controller (10) causes the Pixel Access Controller to present the new pixel and the accumulated pixel to the Transformation Component ALU (20) over connections "d” and "e".
- The Timing Controller (10) then directs the Transformation Component ALU (20) to perform the calculations defined in Equations 4a, 5a and 6a to generate the new values for the transformation components (R↑_RGB, R↓_RGB, T_RGB). These new values include the effects of the new pixel blended with the pixels already combined.
- the Timing Controller (10) then completes one complete pixel blend cycle by causing the Transformation Component ALU (20) to write these new transformation components back into the Blend Accumulation Register (18).
- the Timing Controller (10) continues to cycle down through all of the pixels in this manner until it receives an indication from the Pixel Address Processor (12) via connection “k” that the bottommost pixel has finally been added, or an indication from the Pixel Access Controller (14) via connection “k” that the accumulated pixel has reached a threshold level of opacity (very little transparency) so that blending in the next pixel below will contribute very little to the final image.
- The Timing Controller (10) then causes (via connection "k") the Blend Accumulation Register (18) to write out the reflection transformation component (R↑_RGB) of the composite pixel to the Pixel Color ALU (22) via connection "g".
- The Timing Controller causes (via connection "k") the Pixel Color ALU (22) to calculate the pixel color from its transformation components and the light source color by Equation 7.
- The Timing Controller then causes (via connection "k") the Pixel Color ALU (22) to output the pixel color to the Graphic Display Interface (26) via channel "i", where it is formatted for the Graphic Display (28).
- the Graphic Display Interface (26) then passes the pixel color information in turn to the Graphic Display (28) via connection "j".
- Multi Layer objects or scenes that have been composed by the ART System can be stored in the Complex ART Image Data Format to retain all of the transformation component information for these compound images. This allows the ART system to use these compound images as building blocks for even more complex images.
- The Timing Controller (10) directs the Blend Accumulation Register (18) to output the complete set of transformation components (R↑_RGB, R↓_RGB, T_RGB) to a Composed Image Storage Means (30).
- This Composed Image Storage Means can take the form of any digital storage system including, but not limited to, magnetic tape, magneto optic disk, optical disks, and solid state memory (RAM, ROM, EPROM, Flash, etc.).
- the composed or partially composed images stored in the Composed Image Storage Means can be used at a later time to either play back the stored images, or to be combined with other objects stored in the Graphic Object Storage Memory (16) to yield a new composed image.
- The Timing Controller causes the Composed Image Storage Means via connection "k" to output the reflection transformation components (R↑_RGB) of those images to the Pixel Color ALU (22) via connection "m", and repeats the control process described above for displaying the image.
- a new light source color is stored in the Light Source Color Storage Register (24).
- the Timing Controller (10) causes the Composed Image Storage Means to output the images to the Graphic Object Image Storage Means (16) via connection 'n'.
- these compound images are in the Complex ART Image Data Format
- many of the graphic objects stored in the Graphic Object Storage Means are in the Simple ART Image Data format.
- The Transformation Component ALU (20) receives its inputs from the Graphic Object Memory (16) and the Blend Accumulation Register (18) via the Pixel Access Controller (14) (Fig. 17).
- The inputs from the Graphic Object Memory are either r_RGB, t_RGB (Simple ART Image Data Format) if the object is a single-layer object, or R↑_RGB, R↓_RGB, T_RGB (Complex ART Image Data Format) if the new object is itself a multi-layer object or virtual layer.
- The inputs from the Blend Accumulation Register (the accumulated image to this point) are R↑_RGB, R↓_RGB, T_RGB.
- the newly calculated pixel value at the output of the Transformation Component ALU is also in the complex ART format.
- the Transformation Component ALU can be made up of from one to four arithmetic logic units, depending on the number of color components in the chosen color model and the amount of parallelism that is desired.
- TCALU: Transformation Component ALU
- the internal structure of an individual TCALU (Figs. 19 and 20) can consist of arithmetic units connected in series and parallel to perform the ART calculations.
- FIG. 19 demonstrates the calculations for adding a real (single layer) object to an image (Real Object Composition Equations 4a, 5a, and 6a), while Fig. 20 demonstrates the calculations for adding a virtual object to an image (Virtual Object Composition Equations 4b, 5b and 6b).
- These examples illustrate one method for rapidly computing the ART Imaging algorithms for an RGB color model. As can be seen from the diagrams, the total number of operations to perform both real and virtual object composition would be 27 multiply equivalents (9 each for Red ALU, Green ALU and Blue ALU). If the same structure as shown in Figs. 19 and 20 is used with the Two Term Real and Virtual Object Composition Equations, the total number of computations would be 18 multiply equivalents.
- one TCALU could be used to compute the transformation components of each color component in turn.
- One arithmetic unit together with internal storage registers could be used inside the TCALU to develop each partial term in turn until the final terms are computed.
- The first test {4} compares the Layer Index to the index of the bottommost layer to provide an exit from the layer composition when all the layers have been processed.
- The second test {5} compares the total transparency of the pixel (T) to a minimum Opaque Threshold value (e.g. 98%) to end processing of lower layers when their contribution becomes very small. This mechanism provides a means for the top-down composition to avoid processing (reading, blending and writing) large amounts of image data that will not be seen or will have a very small effect on the image.
- Information for the new layer is accessed and combined {6} (using the ARTOIM algorithms of the previous section) with the layer(s) already contained in the Blend Accumulator.
- The Light Source color (L) is used together with the cumulative reflection {7} at this element to calculate the color to be displayed.
- The scenario on the left side of Fig. 22 represents a scene to be composed by the ARTOIM algorithms. This scene consists of a light source generating light that propagates through a hazy window and through blue cellophane, reflects off a golden apple, and re-emerges to the viewer after passing back through the cellophane and window. Tables 1, 2, and 3 illustrate the ARTOIM calculations for the three regions in the fully composed scene shown on the right side of Fig. 22.
- The first row in Table 1 contains the RGB value of the light source, which is used to convert the final blends (R_RGB) in Table 2 into the final colors displayed in Table 3.
- The balance of Table 1 contains the transformation components of the objects in the scene, represented as percent reflection (r), transmission (t), and absorption (a).
- Table 2 shows the intermediate values of the Blend Accumulator (BAcc) as each object is composed.
- The topmost layer (Hazy Glass) is read into the Blend Accumulation Register in Blend 0, and Blends 1 and 2 successively add in the blue cellophane and the golden apple.
- The values of the transformation components in each row represent the blend of all of the objects or layers above with the object in that row.
- Table 3 shows the conversion of the final transformation components in the three regions to RGB colors.
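The final conversion step of Table 3 can be sketched as a per-channel multiply of the light source RGB by the cumulative reflection of the blended stack. The values below are hypothetical, not those of Tables 1-3:

```python
def displayed_color(light_rgb, reflection_rgb):
    """Final conversion: per-channel light color times cumulative reflection,
    rounded to an integer display value."""
    return tuple(int(l * r + 0.5) for l, r in zip(light_rgb, reflection_rgb))

# Hypothetical values: a white light source and a blended stack whose
# cumulative reflection favors red/green (an apple seen through filters).
light = (255, 255, 255)
blend_reflection = (0.72, 0.55, 0.12)
print(displayed_color(light, blend_reflection))  # -> (184, 140, 31)
```

Because the reflection triple is light-source independent, swapping in a different `light` value recomputes the displayed color with no re-blending of the layers.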
- ART Imaging (transformation components) vs. the Painters Model: The fundamental difference between image composition with the ART Imaging Model and the Painters Model is that the data representation in ART Imaging (transformation components) records not just the color of objects but how they interact with light. This yields an image representation that is light-source independent and allows the lighting of final composed scenes to be changed dynamically, yielding a result that is natural and expected.
- Image composition with the ART Imaging Model can be implemented from the top down: the most important parts of an image (the top layers) are composed first, with successively less important layers combined in turn. This permits a wide range of implementations that optimize for composition speed or quality, depending on imaging system requirements.
- Captured image data is normalized to extract the light source dependency and then converted to the transformation components of the ART Image Data format. Images are then composed from the top down by either the Two Term or the Exact ART algorithms.
- The low-level architecture of an imaging system based on the ART Imaging Model differs from those based on other imaging models in the data formats used for intermediate storage and composition of images. These data formats are chosen to fit efficiently within byte, word, and long-word boundaries in order to minimize storage and processing by the low-level graphics system just prior to image composition and display. These storage formats, as well as the video composition hardware and software, may use more or fewer bits to define each component, depending on the requirements for image manipulation speed and accuracy of image representation. These data formats may also be changed to render them compatible with various graphic output devices such as a video display or a printer. Table 4 illustrates the data formats typically used in current imaging systems together with potential ARTOIM formats.
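As one hedged illustration of fitting components to byte boundaries, the sketch below quantizes a channel's reflection and transmission to one byte each, leaving absorption implicit (a = 1 - r - t). The layout and bit depth are assumptions for illustration only; the patent does not specify this format:

```python
def pack_rt(r, t):
    """Quantize reflection and transmission to one byte each.

    Absorption need not be stored: a = 1 - r - t is recoverable.
    8-bit quantization is illustrative; the text notes implementations
    may use more or fewer bits per component.
    """
    assert 0.0 <= r <= 1.0 and 0.0 <= t <= 1.0 and r + t <= 1.0
    return bytes([int(r * 255 + 0.5), int(t * 255 + 0.5)])

def unpack_rt(b):
    """Recover (r, t, a) from the two-byte packed form."""
    r, t = b[0] / 255.0, b[1] / 255.0
    return r, t, 1.0 - r - t

packed = pack_rt(0.25, 0.50)
print(packed.hex(), unpack_rt(packed))
```

Two bytes per channel keeps a three-channel RGB pixel within a 48-bit boundary, the kind of byte/word alignment the text describes.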
- Light Source Independence - Objects are described by how they transform the light interacting with them and are therefore light source independent. As the color of the light source in a scene changes, the appearance of the objects in the scene changes automatically (no additional computation) in a natural, realistic way as the image is recomposed.
- Transparency - Transparency is a natural, inherent part of the model and algorithms, making it easy to render objects with transparency or translucency and to dynamically anti-alias all objects as they are animated. Special alpha layers, or alpha fields in pixel color values (along with the attendant hardware and software complexity required to support them), are not needed.
- Natural and Intuitive Metaphor - By modeling the way people perceive the world around them, these algorithms make it significantly easier and more intuitive for an artist or composer to visualize a final composite image as he or she adds new layers or objects to an image.
- Composition Order - Since composition order is commutative in the ART imaging methods, both front-to-back and back-to-front compositions are possible.
- The ART methods for front-to-back composition significantly reduce data bus loading and the computation required to compose scenes with many object layers, because the top-down composition techniques eliminate the need to compose those portions of objects hidden behind other opaque objects. They allow graceful degradation of the image as complexity increases (the most important, viewer-side part of the scene is composed first), and they simplify the implementation of digital video effects such as object transparency, anti-aliasing, cross fading, self and color keying, and tiling.
- Ideal Foundation for 3D Imaging - Since the ARTOIM is built on a light-ray-tracing model, extending its multiple-layer 2D composition to 3D composition will be natural and metaphor consistent. These extensions may include adding directionality to the incident light, shading, and objects that extend across many layers.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/250,070 US5638499A (en) | 1994-05-27 | 1994-05-27 | Image composition method and apparatus for developing, storing and reproducing image data using absorption, reflection and transmission properties of images to be combined |
PCT/US1997/006316 WO1998047108A1 (en) | 1994-05-27 | 1997-04-14 | Image composition method and apparatus |
EP97918663A EP0972271A4 (en) | 1997-04-14 | 1997-04-14 | Image composition method and apparatus |
JP10543845A JP2000513849A (en) | 1997-04-14 | 1997-04-14 | Image construction method and apparatus |
AU26717/97A AU2671797A (en) | 1994-05-27 | 1997-04-14 | Image composition method and apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/250,070 US5638499A (en) | 1994-05-27 | 1994-05-27 | Image composition method and apparatus for developing, storing and reproducing image data using absorption, reflection and transmission properties of images to be combined |
PCT/US1997/006316 WO1998047108A1 (en) | 1994-05-27 | 1997-04-14 | Image composition method and apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1998047108A1 true WO1998047108A1 (en) | 1998-10-22 |
Family
ID=26792469
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US1997/006316 WO1998047108A1 (en) | 1994-05-27 | 1997-04-14 | Image composition method and apparatus |
Country Status (3)
Country | Link |
---|---|
US (1) | US5638499A (en) |
AU (1) | AU2671797A (en) |
WO (1) | WO1998047108A1 (en) |
Families Citing this family (59)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4244391B2 (en) * | 1997-04-04 | 2009-03-25 | ソニー株式会社 | Image conversion apparatus and image conversion method |
US6226006B1 (en) | 1997-06-27 | 2001-05-01 | C-Light Partners, Inc. | Method and apparatus for providing shading in a graphic display system |
US6034694A (en) * | 1997-06-30 | 2000-03-07 | Sun Microsystems, Inc. | Method and apparatus for pixel composition |
JPH1166271A (en) * | 1997-08-11 | 1999-03-09 | Sony Computer Entertainment:Kk | Method and device for image composition |
JP3580682B2 (en) * | 1997-09-26 | 2004-10-27 | 株式会社ソニー・コンピュータエンタテインメント | Image processing apparatus and method |
US6259457B1 (en) | 1998-02-06 | 2001-07-10 | Random Eye Technologies Inc. | System and method for generating graphics montage images |
US6026143A (en) * | 1998-02-11 | 2000-02-15 | Analogic Corporation | Apparatus and method for detecting sheet objects in computed tomography data |
US6272230B1 (en) | 1998-02-11 | 2001-08-07 | Analogic Corporation | Apparatus and method for optimizing detection of objects in computed tomography data |
US6026171A (en) * | 1998-02-11 | 2000-02-15 | Analogic Corporation | Apparatus and method for detection of liquids in computed tomography data |
US6035014A (en) * | 1998-02-11 | 2000-03-07 | Analogic Corporation | Multiple-stage apparatus and method for detecting objects in computed tomography data |
US6078642A (en) * | 1998-02-11 | 2000-06-20 | Analogic Corporation | Apparatus and method for density discrimination of objects in computed tomography data using multiple density ranges |
US6076400A (en) * | 1998-02-11 | 2000-06-20 | Analogic Corporation | Apparatus and method for classifying objects in computed tomography data using density dependent mass thresholds |
US6111974A (en) * | 1998-02-11 | 2000-08-29 | Analogic Corporation | Apparatus and method for detecting sheet objects in computed tomography data |
US6067366A (en) * | 1998-02-11 | 2000-05-23 | Analogic Corporation | Apparatus and method for detecting objects in computed tomography data using erosion and dilation of objects |
US6075871A (en) * | 1998-02-11 | 2000-06-13 | Analogic Corporation | Apparatus and method for eroding objects in computed tomography data |
US6128365A (en) * | 1998-02-11 | 2000-10-03 | Analogic Corporation | Apparatus and method for combining related objects in computed tomography data |
US6317509B1 (en) | 1998-02-11 | 2001-11-13 | Analogic Corporation | Computed tomography apparatus and method for classifying objects |
EP1062555A4 (en) * | 1998-02-11 | 2001-05-23 | Analogic Corp | Computed tomography apparatus and method for classifying objects |
US6912311B2 (en) * | 1998-06-30 | 2005-06-28 | Flashpoint Technology, Inc. | Creation and use of complex image templates |
US7446774B1 (en) | 1998-11-09 | 2008-11-04 | Broadcom Corporation | Video and graphics system with an integrated system bridge controller |
US6573905B1 (en) | 1999-11-09 | 2003-06-03 | Broadcom Corporation | Video and graphics system with parallel processing of graphics windows |
US6853385B1 (en) | 1999-11-09 | 2005-02-08 | Broadcom Corporation | Video, audio and graphics decode, composite and display system |
US6636222B1 (en) | 1999-11-09 | 2003-10-21 | Broadcom Corporation | Video and graphics system with an MPEG video decoder for concurrent multi-row decoding |
US6661422B1 (en) | 1998-11-09 | 2003-12-09 | Broadcom Corporation | Video and graphics system with MPEG specific data transfer commands |
US7982740B2 (en) | 1998-11-09 | 2011-07-19 | Broadcom Corporation | Low resolution graphics mode support using window descriptors |
US6768774B1 (en) | 1998-11-09 | 2004-07-27 | Broadcom Corporation | Video and graphics system with video scaling |
US6798420B1 (en) | 1998-11-09 | 2004-09-28 | Broadcom Corporation | Video and graphics system with a single-port RAM |
US6700588B1 (en) | 1998-11-09 | 2004-03-02 | Broadcom Corporation | Apparatus and method for blending graphics and video surfaces |
US20040004623A1 (en) * | 1998-12-11 | 2004-01-08 | Intel Corporation | Apparatus, systems, and methods to control image transparency |
WO2000055813A1 (en) * | 1999-03-12 | 2000-09-21 | Collodi David J | Method and apparatus for providing shading in a graphic display system |
US6500008B1 (en) * | 1999-03-15 | 2002-12-31 | Information Decision Technologies, Llc | Augmented reality-based firefighter training system and method |
US6369830B1 (en) * | 1999-05-10 | 2002-04-09 | Apple Computer, Inc. | Rendering translucent layers in a display system |
US6525746B1 (en) * | 1999-08-16 | 2003-02-25 | University Of Washington | Interactive video object processing environment having zoom window |
JP3502024B2 (en) | 1999-09-10 | 2004-03-02 | 株式会社ソニー・コンピュータエンタテインメント | Image processing apparatus, recording medium, and image processing method |
US6975324B1 (en) | 1999-11-09 | 2005-12-13 | Broadcom Corporation | Video and graphics system with a video transport processor |
US9668011B2 (en) | 2001-02-05 | 2017-05-30 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Single chip set-top box system |
US7088375B1 (en) * | 2000-04-27 | 2006-08-08 | Adobe Systems Incorporated | Grouping layers in composited image manipulation |
US6803923B1 (en) * | 2000-05-16 | 2004-10-12 | Adobe Systems Incorporated | Determining composition order from layer effects |
US7071937B1 (en) | 2000-05-30 | 2006-07-04 | Ccvg, Inc. | Dirt map method and apparatus for graphic display system |
US6675120B2 (en) * | 2000-06-27 | 2004-01-06 | Photon Dynamics, Inc. | Color optical inspection system |
US6614431B1 (en) * | 2001-01-18 | 2003-09-02 | David J. Collodi | Method and system for improved per-pixel shading in a computer graphics system |
US6856323B2 (en) * | 2001-04-09 | 2005-02-15 | Weather Central, Inc. | Layered image rendering |
KR100914636B1 (en) * | 2001-05-29 | 2009-08-28 | 코닌클리케 필립스 일렉트로닉스 엔.브이. | A method of transmitting a visual communication signal, a transmitter for transmitting a visual communication signal and a receiver for receiving a visual communication signal |
JP3577016B2 (en) * | 2001-08-24 | 2004-10-13 | コナミ株式会社 | 3D image processing program, 3D image processing method, and video game apparatus |
US7356453B2 (en) * | 2001-11-14 | 2008-04-08 | Columbia Insurance Company | Computerized pattern texturing |
NZ525956A (en) | 2003-05-16 | 2005-10-28 | Deep Video Imaging Ltd | Display control system for use with multi-layer displays |
US8063916B2 (en) * | 2003-10-22 | 2011-11-22 | Broadcom Corporation | Graphics layer reduction for video composition |
DE602004010777T2 (en) * | 2004-02-18 | 2008-12-04 | Harman Becker Automotive Systems Gmbh | Alpha mix based on a lookup table |
US7817302B2 (en) * | 2004-03-26 | 2010-10-19 | Lexmark International, Inc. | Optimizing raster operation functions during print job processing |
US20050213117A1 (en) * | 2004-03-26 | 2005-09-29 | Lexmark International, Inc. | Processing print jobs according to hard or easy processing zones |
US7835030B2 (en) * | 2004-03-26 | 2010-11-16 | Lexmark International, Inc. | Processing print jobs |
US20050213142A1 (en) * | 2004-03-26 | 2005-09-29 | Clark Raymond E | Optimization techniques during processing of print jobs |
US20050213119A1 (en) * | 2004-03-26 | 2005-09-29 | Lexmark International, Inc. | Processing print jobs according to size of to-be-printed objects and bands containing same |
US7385729B2 (en) * | 2004-03-26 | 2008-06-10 | Lexmark International, Inc. | Optimization techniques during processing of print jobs |
US7859716B2 (en) | 2004-03-26 | 2010-12-28 | Lexmark International, Inc. | Optimizing to-be-printed objects during print job processing |
GB2517185B (en) * | 2013-08-14 | 2020-03-04 | Advanced Risc Mach Ltd | Graphics tile compositing control |
US9805478B2 (en) | 2013-08-14 | 2017-10-31 | Arm Limited | Compositing plural layer of image data for display |
KR20160021607A (en) * | 2014-08-18 | 2016-02-26 | 삼성전자주식회사 | Method and device to display background image |
KR102384304B1 (en) * | 2019-07-15 | 2022-04-07 | 레고 에이/에스 | Rendering method and rendering device performing the same |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5239624A (en) * | 1985-06-19 | 1993-08-24 | Pixar | Pseudo-random point sampling techniques in computer graphics |
US5268996A (en) * | 1990-12-20 | 1993-12-07 | General Electric Company | Computer image generation method for determination of total pixel illumination due to plural light sources |
US5488700A (en) * | 1993-07-30 | 1996-01-30 | Xerox Corporation | Image rendering system with local, adaptive estimation of incident diffuse energy |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3095304A (en) * | 1958-05-15 | 1963-06-25 | Motion Picture Res Council Inc | Composite photography utilizing sodium vapor illumination |
US3158477A (en) * | 1959-04-02 | 1964-11-24 | Motion Picture Res Council Inc | Composite color photography |
US4100569A (en) * | 1976-11-03 | 1978-07-11 | Petro Vlahos | Comprehensive electronic compositing system |
US4409611A (en) * | 1981-09-24 | 1983-10-11 | Vlahos-Gottschalk Research Corp., (Now) Ultimatte Corp. | Encoded signal color image compositing |
US5374193A (en) * | 1983-01-25 | 1994-12-20 | Trachtman; Joseph N. | Methods and apparatus for use in alpha training, EMG training and dichotic learning |
US4625231A (en) * | 1984-04-27 | 1986-11-25 | Ultimatte Corporation | Comprehensive electronic compositing system |
US5271097A (en) * | 1988-06-30 | 1993-12-14 | International Business Machines Corporation | Method and system for controlling the presentation of nested overlays utilizing image area mixing attributes |
JP3227191B2 (en) * | 1991-05-22 | 2001-11-12 | 株式会社リコー | Image reading device |
TW225595B (en) * | 1991-09-03 | 1994-06-21 | Gen Electric | |
US5327171A (en) * | 1992-05-26 | 1994-07-05 | United Parcel Service Of America, Inc. | Camera system optics |
US5408447A (en) * | 1992-07-15 | 1995-04-18 | Polaroid Corporation | Method and apparatus for scanning of image in integral film structure |
JP3679114B2 (en) * | 1993-04-15 | 2005-08-03 | アルティマット・コーポレーション | System and method for synthesizing video signals |
- 1994-05-27 US US08/250,070 patent/US5638499A/en not_active Expired - Lifetime
- 1997-04-14 AU AU26717/97A patent/AU2671797A/en not_active Abandoned
- 1997-04-14 WO PCT/US1997/006316 patent/WO1998047108A1/en not_active Application Discontinuation
Also Published As
Publication number | Publication date |
---|---|
US5638499A (en) | 1997-06-10 |
AU2671797A (en) | 1998-11-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5638499A (en) | Image composition method and apparatus for developing, storing and reproducing image data using absorption, reflection and transmission properties of images to be combined | |
Merritt et al. | [26] Raster3D: Photorealistic molecular graphics | |
US5740343A (en) | Texture compositing apparatus and method | |
Thies et al. | Ignor: Image-guided neural object rendering | |
US5917937A (en) | Method for performing stereo matching to recover depths, colors and opacities of surface elements | |
US8244029B1 (en) | Recursive filters on GPUs | |
Zollmann et al. | Image-based ghostings for single layer occlusions in augmented reality | |
US7227555B2 (en) | Rendering volumetric fog and other gaseous phenomena | |
US6885370B2 (en) | System and method for rendering images using a Russian roulette methodology for evaluating global illumination | |
US20050219264A1 (en) | Pop-up light field | |
US6515674B1 (en) | Apparatus for and of rendering 3d objects with parametric texture maps | |
US20020113791A1 (en) | Image-based virtual reality player with integrated 3D graphics objects | |
US11830051B2 (en) | System and method for high quality renderings of synthetic views of custom products | |
CN112837402A (en) | Scene rendering method and device, computer equipment and storage medium | |
US20030090482A1 (en) | 2D to 3D stereo plug-ins | |
Xu et al. | Scalable image-based indoor scene rendering with reflections | |
Debevec et al. | High dynamic range imaging | |
US7129961B1 (en) | System and method for dynamic autocropping of images | |
JP6898264B2 (en) | Synthesizers, methods and programs | |
EP0972271A1 (en) | Image composition method and apparatus | |
US6781583B2 (en) | System for generating a synthetic scene | |
Vandame et al. | Pipeline for real-time video view synthesis | |
US20230177744A1 (en) | Augmented reality system and method for substrates, coated articles, insulating glass units, and/or the like | |
JP3235151B2 (en) | Image simulation method | |
JP2661921B2 (en) | Object surface processing method and processing device for two-dimensional representation of three-dimensional object |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AU CA CN JP KR US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE |
|
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 1997918663 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref country code: JP Ref document number: 1998 543845 Kind code of ref document: A Format of ref document f/p: F |
|
WWP | Wipo information: published in national office |
Ref document number: 1997918663 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: CA |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 1997918663 Country of ref document: EP |