WO2002050779A1 - Method and apparatus for visualization of 3D voxel data using lit opacity volumes with shading


Info

Publication number
WO2002050779A1
Authority
WO
WIPO (PCT)
Prior art keywords
opacity
voxel
voxels
volume
revised
Application number
PCT/US2001/048439
Other languages
French (fr)
Inventor
Dmitriy G. Repin
Mark S. Passolt
Original Assignee
Schlumberger Holdings Limited
Schlumberger Canada Limited
Services Petroliers Schlumberger
Application filed by Schlumberger Holdings Limited, Schlumberger Canada Limited, Services Petroliers Schlumberger filed Critical Schlumberger Holdings Limited
Priority to GB0314041A priority Critical patent/GB2386811B/en
Priority to AU2002236628A priority patent/AU2002236628A1/en
Priority to CA002432090A priority patent/CA2432090C/en
Publication of WO2002050779A1 publication Critical patent/WO2002050779A1/en
Priority to NO20023903A priority patent/NO324124B1/en

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01V - GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V1/00 - Seismology; Seismic or acoustic prospecting or detecting
    • G01V1/28 - Processing seismic data, e.g. analysis, for interpretation, for correction
    • G01V1/34 - Displaying seismic recordings or visualisation of seismic data or attributes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/08 - Volume rendering
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/50 - Lighting effects
    • G06T15/503 - Blending, e.g. for anti-aliasing

Definitions

  • The invention is concerned with improving the visual quality of images produced by rendering volumetric data in voxel format for the display of three-dimensional (3D) data on a two-dimensional (2D) display, with shading and opacity to control the realistic display of images rendered from the voxels.
  • One of the main shortcomings of standard opacity volume rendering of the voxel format data volumes described in the Background of the Invention is that objects displayed using a voxel data volume appear flat, which inhibits depth perception and makes it hard, if not impossible, to determine the 3D shape, orientation, and relative positions of the objects.
  • Figure 12 shows the display of an elliptical object rendered using standard opacity volume rendering of volumetric data consisting of voxels.
  • Figure 13 shows how an elliptical set of voxels will appear when displayed as a set of solid "bricks" lit with a light. The actual shape of the rendered object can now be seen.
  • the invention illuminates a voxel volume with one or more light sources.
  • the lighted volume offers the viewer ample visual information to aid in the perception of depth, as well as the shapes, orientations, and positions of objects in the voxel volume.
  • Lighting parameters are computed, and graphical attributes of individual voxels are adjusted based on the orientation of opacity isosurfaces passing through each voxel in the volume.
  • the opacity of displayed voxels of a data volume may be adjusted depending upon the opacity of nearby voxels.
  • Each voxel in a volume is assigned a standard opacity value α using a 3-dimensional opacity function α(i,j,k), where α represents opacity and the letters i, j, k represent orthogonal directions.
  • Isosurfaces connect voxels with equal opacity values. For each voxel, the algorithm estimates a normal to the isosurface passing through the voxel, which is equal to the negative gradient of the opacity function at the voxel, and uses it to shade the voxel as if it were a point on the isosurface. Though results are typically shown for a single unidirectional white light source, the results can be easily extended to bi-directional light sources or colored lights.
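The normal estimate described above can be sketched in a few lines. This is a simplification using central differences on a dense opacity array, an assumption for illustration; the patent's own method uses the cell-based sums described later:

```python
import numpy as np

def isosurface_normal(alpha, i, j, k):
    """Estimate the unit normal to the opacity isosurface through voxel
    (i, j, k) as the negative gradient of the opacity function, here by
    central differences (a simplification of the cell-based sums)."""
    g = np.array([alpha[i + 1, j, k] - alpha[i - 1, j, k],
                  alpha[i, j + 1, k] - alpha[i, j - 1, k],
                  alpha[i, j, k + 1] - alpha[i, j, k - 1]])
    n = -g
    length = np.linalg.norm(n)
    return n / length if length > 0 else n

# Opacity increasing along the first axis gives a normal pointing back along it.
alpha = np.fromfunction(lambda i, j, k: 0.1 * i, (3, 3, 3))
assert np.allclose(isosurface_normal(alpha, 1, 1, 1), [-1.0, 0.0, 0.0])
```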
  • In Fig. 14 are shown the basic steps implemented in the rendering process that includes the teaching of the present invention, and these steps are generally described immediately below.
  • the invention is implemented only in the second and third steps, in blocks 1402, 1403 and 1404.
  • The initial opacity value of each voxel in the input voxel data volume is processed with an opacity tool, such as shown in Fig. 10, to make most of the data transparent when displayed, as previously described, and the structures inside the volume are no longer obscured as can be seen in Fig. 11.
  • the initial opacity value for each visible voxel in the data volume is converted to a standard opacity ( ⁇ ) value.
  • the standard opacity value ⁇ for each visible voxel is converted to a new "visible opacity" ⁇ value in accordance with the teaching of the present invention.
  • a new visible gradient value is calculated for every visible voxel in a volume using the new visible opacity ⁇ values in accordance with the teaching of the present invention.
  • The new visible gradient value calculated for each voxel accounts for degenerate cases in numeric gradient computation and is used in all further rendering computations. Only visible cells, selected using the opacity tool and having a new visible opacity ᾱ calculated, have their opacity ᾱ value used to compute a new visible gradient, G, for each voxel.
  • The new visible gradient, G, for each voxel is used to shade and render the displayed voxels.
  • ambient and diffuse shading is computed that would be applied to an opacity isosurface passing through the voxels in an ambient and diffuse illumination model wherein the voxel volume is illuminated with one or more light sources (typically directional and bi-directional).
  • the direction of the negative visible gradient vector G serves as a normal to the isosurface.
  • Special treatment is added for voxels inside opaque areas of the volume based on specifics of volumetric geoscience data, which improves a user's perception of the rendered image.
  • a lighted and shaded volume offers the viewer ample visual information to aid in the perception of depth, as well as the shapes, orientations, and positions of objects in the volume.
  • the colors of all visible voxels in the volume are modified by applying the shading computed in block 1404.
  • a computer is used to process voxel volume data, and the invention comprises an article of manufacture comprising a medium that is readable by the computer and the medium carries instructions for the computer to perform the novel process of calculating the new visible opacity and the visible gradient for each voxel as described in the previous paragraphs.
  • Fig. 1 is a representation of voxels within a cubic volume
  • Fig. 2 is a representation of the storage of voxel values in a memory
  • Figs. 3-5 illustrate the partitioning of a volume into slices in orthogonal planes for 2D texture rendering
  • Fig. 6 is a representation of a volume rendered image without shading
  • Fig 7 is a block diagram of the steps in a volume rendering process
  • Fig 8 is a block diagram of volume rendering equipment
  • Fig 9 is a rendering of an opaque volume
  • Fig 10 is a representation of an opacity tool
  • Fig 11 is a rendering of a semi-transparent volume after an opaque volume is processed using an opacity tool
  • Fig. 12 shows the display of an elliptical object rendered using standard opacity volume rendering of volumetric data consisting of voxels without shading
  • Fig 13 shows how an elliptical set of voxels would appear when displayed as a set of solid "bricks" lit with a light;
  • Fig. 14 shows in block diagram form the steps of processing and displaying volumetric geoscience data consisting of a volume of voxel data;
  • Figs. 15 - 17 are exploded views of volumes each showing a cell made up of voxels that is used to calculate new visual opacity and visual gradient values for voxels;
  • Figs. 18- 20 are tables showing how the new visual opacity values for voxels are calculated.
  • Figs. 21 - 23 are tables showing how the new visual opacity values are used to calculate new visual gradient values for voxels.
  • Volumetric data volume is first partitioned into slices as illustrated in Figs. 3, 4 and 5, and the slices each contain rows and columns of voxels, not seen in these Figures but seen in Figures 15 - 17.
  • Volume rendering of the data may be sped up by reducing the number of voxels that are processed, downloaded and rendered, eliminating transparent voxels. Such volume rendering is taught in the previously mentioned U.S. Patent No. 6,304,266 that is incorporated herein by reference.
  • In Figs. 15 - 17 are shown a representative volume of voxels formed into a plurality of slices A - E and cells. As particularly applied to geoscience data, voxels are grouped into cells containing no fewer than twenty-seven alpha-numeric designated voxels (e.g., A11, C21) forming a cube. Other undesignated voxels shown in these Figures are other voxels of the data volume in which the cells being described are located. Only one cell is shown in each of these Figures for simplicity, but in reality there would be many slices and many cells in each slice of a volume. These cells are described in greater detail hereinafter.
  • The slices are shown separated from each other only for ease in seeing the twenty-seven voxels A11-A33, B11-B33 and C11-C33 that make up cells one and three in Figs. 15 and 17 respectively, and the twenty-seven voxels B11-B33, C11-C33 and D11-D33 that make up cell two in Fig. 16.
  • Only horizontal slices are specifically shown in Figs. 15, 16 and 17, but there are also vertical slices, and the intersecting orthogonal slices through the data volume create the three-dimensional rows and columns of voxels shown. In this particular case, the vector/arrow direction of the light source is parallel to one of the three orthogonal axes (u,v,w) of the sliced data volume, while the other two axes are orthogonal to the direction of the light source vector.
  • In Figs. 15 and 16 the light vector is parallel to axis W, and in Fig. 17 the light vector is parallel to axis V.
  • The method employed supports arbitrary directions for the light source.
  • The value of standard opacity, α, for the voxels in each cell is used to calculate the values of visible opacity ᾱ for the voxels in each cell, as described hereinafter with reference to Figs. 18 - 20.
  • a different cell partitioning size might be appropriate for other data, such as medical or meteorological data.
  • An example of another cell size may be one hundred twenty-five voxels forming a cube of 5×5×5 voxels. The number of voxels in a cell is based on the distribution of data in the volume.
  • In Fig. 15, cell one, the particular voxel of interest is the center voxel B22. In Fig. 15 light is incident on the top of the data volume and cell one and is represented by the light source vector/arrow.
  • In Fig. 16, cell two, the particular voxel of interest is the center voxel C22. In Fig. 16 light is incident on the top of the data volume and cell two and is represented by the light source vector/arrow.
  • cell three is made up of the same voxels as cell one, but light is incident on the right side of the data volume and cell three as represented by the light source vector / arrow.
  • the initial opacity value for each voxel in a data volume is first mapped with an opacity tool, such as the one illustrated in Fig. 10 and that is included in the GEOVIZ product of Schlumberger-GeoQuest, to obtain a "standard opacity" value ⁇ for each voxel in a data volume.
  • When voxel A11 is semi-transparent (initial opacity value between 0.0 and 1.0), the opacity value of voxels B11 and C11 behind voxel A11 cannot be any less.
  • For example, if voxel A11 is semi-transparent with an opacity value of 0.7, voxels B11 and C11 cannot have a visible opacity value any lower than 0.7 and are changed accordingly in calculating visible opacity values ᾱ for voxels B11 and C11.
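The rule above, that a voxel behind a semi-transparent voxel can be no less opaque than the voxel in front of it with respect to the light, can be sketched as a running maximum along the light direction. This is one plausible reading of the behavior, an assumption for illustration; the patent's exact per-cell arithmetic is given in the tables of Figs. 18 - 20:

```python
import numpy as np

def visible_opacity(alpha, light_axis=0):
    """Propagate opacity along the light direction so that no voxel's
    visible opacity is lower than that of any voxel in front of it
    (a sketch, not the patent's exact table arithmetic)."""
    return np.maximum.accumulate(alpha, axis=light_axis)

# Column of voxels lit from the A side: A11 = 0.7, B11 = 0.2, C11 = 0.5.
column = np.array([0.7, 0.2, 0.5])
assert np.allclose(visible_opacity(column), [0.7, 0.7, 0.7])
```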
  • Figs. 18 - 20 are tables used to simplify and more clearly show the mathematical calculations performed to derive the new visible opacity values ᾱ for all twenty-seven voxels in each of cells one, two and three, using the "standard opacity" value α of the voxels in accordance with the teaching of the present invention.
  • In these tables, "α" indicates standard opacity (αA11 is the standard opacity of voxel A11), and "ᾱ" indicates the new visible opacity (ᾱA11 is the visible opacity of voxel A11).
  • Although only one cell is shown in Fig. 15, there are in reality many cells in a data volume, and the visible opacity ᾱ is determined for all voxels in the data volume. This is done by placing each voxel at the center of a cell and performing the calculations described above. The voxels on the outer surface of the volume can be disregarded.
  • Fig. 19 is a table showing how standard opacity ⁇ values of each of the voxels in cell two (Fig. 16) are used to obtain the new visible opacity ⁇ values for each of the voxels in cell two.
  • Although only one cell is shown in Fig. 16, there are in reality many cells in a data volume, and the visible opacity ᾱ is determined for all voxels in the data volume. This is done by placing each voxel at the center of a cell and performing the calculations described above. The voxels on the outer surface of the volume can be disregarded.
  • In Fig. 17 is cell three, which has the same voxels as cell one shown in Fig. 15, but the direction of light on the data volume and cell three is from the right side, rather than from the top. Accordingly, the calculations for visible opacity ᾱ are performed in an identical manner but along a different direction.
  • The columns of three voxels run on their side through cell three; for example, voxels B23, B22, B21 and voxels C13, C12, C11.
  • The tables for calculating visible opacity ᾱ for all voxels in cell three are shown in Fig. 20. In view of the previous description of how these calculations are made with reference to the tables in Figs. 18 and 19, the description is not repeated here for the sake of brevity.
  • The visible opacity ᾱ of all the voxels in cell three is used to calculate the visible opacity gradient G only for voxel B22 in the center of cell three, as described hereinafter with reference to Fig. 23. Again, there are many cells in the data volume in Fig. 17 and visible opacity is determined for all voxels.
  • Figs. 21 - 23 are tables used to describe the mathematical calculations performed to derive the three gradient components G_u, G_v and G_w that define a new visible opacity gradient G for only the voxel in the center of each of representative cells one, two and three in accordance with the teaching of the present invention.
  • the gradients must be derived for all voxels in a data volume so similar tables are derived for the other voxels in a data volume, not just cells one, two and three.
  • the three gradient components are calculated for every voxel using the newly calculated value of visible opacity ⁇ for all voxels in each cell. For cell one in Fig. 15 the center voxel is B22; for cell two in Fig. 16 the center voxel is C22; and for cell three in Fig. 17 the center voxel is B22.
  • The new visible opacity gradients G for all voxels are then used to render the voxel data volume in a manner well known in the art. These calculations are repeated for each voxel in a voxel volume and are described in greater detail hereinafter.
  • The new visible opacity ᾱ values calculated for each of the twenty-seven voxels B11 through D33 in cell two are used in the equations in Fig. 22 to calculate gradient components G_u, G_v and G_w of the visible opacity gradient vector G only for voxel C22 in the middle of cell two.
  • The gradient components are then combined to get the negative visible opacity gradient for voxel C22.
  • The visible opacity ᾱ values for the twenty-seven voxels A11 through C33 in cell three are used in the equations in Fig. 23 to calculate vector components G_u, G_v and G_w of the visible opacity gradient vector G only for voxel B22.
  • A negative opacity gradient, G, at a particular voxel is determined by three partial derivatives along the three major axes, G_u, G_v and G_w, as:

    G_u = Σ_{j,k} α(i+1,j,k) - Σ_{j,k} α(i-1,j,k)
    G_v = Σ_{i,k} α(i,j+1,k) - Σ_{i,k} α(i,j-1,k)
    G_w = Σ_{i,j} α(i,j,k+1) - Σ_{i,j} α(i,j,k-1)
  • These three standard gradient equations for calculating vector components G_u, G_v and G_w are modified to calculate a new negative "visible opacity gradient" vector G, shown in the following equations, by using the new visible opacity ᾱ values rather than the standard opacity values α shown in the equations immediately above. Substituting the visible opacity ᾱ in the standard gradient equations, the gradient equations become:
  • G_u = Σ_{j,k} ᾱ(i+1,j,k) - Σ_{j,k} ᾱ(i-1,j,k)
    G_v = Σ_{i,k} ᾱ(i,j+1,k) - Σ_{i,k} ᾱ(i,j-1,k)
    G_w = Σ_{i,j} ᾱ(i,j,k+1) - Σ_{i,j} ᾱ(i,j,k-1)

    where ᾱ(i,j,k) is the visible opacity of the voxel at position (i,j,k) along its three major axes i, j, k.
  • The negative visible opacity gradient is then G = (-G_u, -G_v, -G_w), where the vector components G_u, G_v and G_w are calculated using the tables in Figs. 21 - 23.
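The gradient equations can be sketched for a single 3×3×3 cell: each component is the summed visible opacity of one cell face minus the summed visible opacity of the opposing face. The function name and the dense-array representation are illustrative assumptions:

```python
import numpy as np

def visible_gradient(cell):
    """Gradient components (G_u, G_v, G_w) for the centre voxel of a 3x3x3
    cell of visible opacities: each component is the difference of the
    summed opposing faces of the cell."""
    g_u = cell[2, :, :].sum() - cell[0, :, :].sum()
    g_v = cell[:, 2, :].sum() - cell[:, 0, :].sum()
    g_w = cell[:, :, 2].sum() - cell[:, :, 0].sum()
    return np.array([g_u, g_v, g_w])

# Visible opacity increasing along the first (u) axis: only G_u is non-zero;
# the shading normal is the negative of this gradient.
cell = np.fromfunction(lambda i, j, k: 1.0 * i, (3, 3, 3))
assert np.allclose(visible_gradient(cell), [18.0, 0.0, 0.0])
assert np.allclose(-visible_gradient(cell), [-18.0, 0.0, 0.0])
```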
  • The new visible opacity gradient G for each of voxels B22 (cell one), C22 (cell two) and B22 (cell three) is then used to calculate the ambient and diffuse shading intensity for those voxels in a manner previously known, with the addition of special treatments provided by this invention, generally described with reference to block 1404 in Fig. 14.
  • Shading is computed as if it were applied to an opacity isosurface passing through all voxels in a volume in an ambient and diffuse illumination model wherein the voxel volume is illuminated with one or more light sources (typically directional and bidirectional).
  • the direction of the negative visible opacity gradient vector serves in this case as a normal to the isosurface.
  • a lighted and shaded volume offers the viewer ample visual information to aid in the perception of depth, as well as the shapes, orientations, and positions of objects in the volume.
  • The ambient and diffuse shading intensity is calculated as follows:

    if G·L > 0:    I_shading = I_ambient + I_diffuse · (G·L) / (|G| |L|)

    else if G·L ≤ 0, the voxel is treated as positioned within a homogeneous voxel body and

    I_shading = 1
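The shading rule can be sketched directly. The intensity coefficients below are illustrative assumptions, not values from the patent:

```python
import numpy as np

def shading_intensity(g, light, i_ambient=0.2, i_diffuse=0.6):
    """Ambient + diffuse intensity from the negative visible gradient g
    (used as the isosurface normal) and the light direction L. When
    g . L <= 0 the voxel is treated as lying inside a homogeneous opaque
    body and gets full intensity 1."""
    dot = float(np.dot(g, light))
    if dot > 0:
        return i_ambient + i_diffuse * dot / (np.linalg.norm(g) * np.linalg.norm(light))
    return 1.0

# Normal facing the light: full ambient + diffuse contribution (0.2 + 0.6).
assert abs(shading_intensity(np.array([0.0, 0.0, 1.0]),
                             np.array([0.0, 0.0, 1.0])) - 0.8) < 1e-9
# Normal facing away: treated as inside an opaque body.
assert shading_intensity(np.array([0.0, 0.0, -1.0]),
                         np.array([0.0, 0.0, 1.0])) == 1.0
```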
  • After the shading intensity has been computed, including the special treatment in scenarios (a) - (d), it is applied to a color-coded opacity volume, as referred to with reference to block 1405 in Fig. 14. There are multiple known ways to implement this. Often, the color of a voxel is derived from the associated data value, v, using a color palette, also called a look-up table or LUT, as:

    (r, g, b, a) = LUT(v)

    where LUT is essentially a one-dimensional array of r,g,b,a quadruples indexed by the data value, v.
  • the final color of the voxel depends not only on the data value, but also on the shading intensity associated with the voxel.
  • the initial formula calls for multiplication of each of the color components of each voxel by the shading intensity for the voxel. This requires many additional computations, slowing the process.
  • To avoid this, an extended color palette SHADING_LUT is used: a two-dimensional matrix composed of r,g,b,a columns computed for different values of shading intensity, I_shading.
  • The extended color palette can then be used to look up the color of a shaded voxel using the data value, v, and shading intensity, I_shading, as indexes:

    (r, g, b, a) = SHADING_LUT(v, I_shading)
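The extended palette can be sketched as a precomputed table. The table sizes, the quantization scheme, and the random stand-in palette are all assumptions for illustration:

```python
import numpy as np

# Precompute shaded colours for a range of shading intensities so rendering
# needs only a table look-up instead of per-voxel colour multiplications.
N_VALUES, N_SHADES = 256, 64
rng = np.random.default_rng(0)
base_lut = rng.random((N_VALUES, 4))          # stand-in 1D colour palette (r,g,b,a)

levels = np.linspace(0.0, 1.0, N_SHADES)      # quantised shading intensities
shading_lut = np.empty((N_SHADES, N_VALUES, 4))
shading_lut[:, :, :3] = levels[:, None, None] * base_lut[None, :, :3]
shading_lut[:, :, 3] = base_lut[None, :, 3]   # opacity is left unshaded

def shaded_color(value_index, intensity):
    """Look up the shaded colour of a voxel by data value and intensity."""
    s = int(round(intensity * (N_SHADES - 1)))
    return shading_lut[s, value_index]

# Full intensity reproduces the base palette colour; zero intensity is black.
assert np.allclose(shaded_color(10, 1.0), base_lut[10])
assert np.allclose(shaded_color(10, 0.0)[:3], 0.0)
```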
  • the initial coloring (r,g,b) of a 3D seismic opacity volume does not depend on the volume opacity and can be specified based on any data values associated with a volume (e.g., amplitude, instantaneous frequency, etc.).
  • The color components of the voxel material are modified by multiplying them by the shading intensity:

    (r', g', b') = (r · I_shading, g · I_shading, b · I_shading)
  • the results obtained from this calculation are the rendered volumetric data that is used to display the 3D seismic information volume on a 2D display device.


Abstract

A volume rendering process is disclosed for improving the visual quality of images produced by rendering and displaying volumetric data in voxel format for the display of three-dimensional data on a two-dimensional display, with opacity (1402) and shading (1404) to control the realistic display of images rendered from the voxels. The process includes partitioning the plurality of voxels among a plurality of slices with each slice corresponding to a respective region of the volume. Each voxel includes an opacity value adjusted by applying an opacity curve to the value. The opacity value of each voxel in each cell in the volume is converted into a new visual opacity value that is used to calculate a new visual opacity gradient (1403) for only one voxel in the center of each cell. The visual opacity gradient is used to calculate the shading used to modify the color of the individual voxels based on the orientation of opacity isosurfaces passing through each voxel in the volume in order to create a high quality realistic image.

Description

METHOD AND APPARATUS FOR VISUALIZATION OF 3D VOXEL DATA USING LIT OPACITY VOLUMES WITH SHADING
Field of the Invention
This invention relates generally to the field of computer graphics. Particularly, this invention relates to volume rendering. More particularly, this invention relates to the display of three-dimensional (3D) data on a two-dimensional (2D) display with shading and opacity to control the realistic display of volumetric data in voxel format.
Background of the Invention
Volume rendering is an important area of computer graphics. It is employed in a wide variety of disciplines, including medicine, geology, biology and meteorology. Volume rendering allows a user to look inside an object and see features that were otherwise shielded by the rendering of the surface features of the object. One patent teaching volume rendering using voxels is U.S. Patent No. 6,304,266, issued October 16, 2001, entitled "Method and Apparatus For Volume Rendering" which is incorporated herein by reference.
In Fig. 1 volumetric data is shown as consisting of a three-dimensional (3D) dataset of elements called "voxels" 102. Typically, the voxels 102 are uniformly distributed throughout a volume 104. Each voxel 102 has a position in the volume and has associated with it information such as color, illumination, opacity, velocity, amplitude, etc. The information associated with each voxel 102 is produced by such disciplines as medicine (e.g., CAT scans), biology (confocal microscopy), and geoscience (seismic data).
In Fig. 2 is shown how data values of voxels 102 are typically stored in a storage array 202. The position of a particular voxel in the volume is inherent in its location in the array. For example, array position 204 might be associated with a point 106 (Fig. 1) in the volume that is a specified distance from a specified corner of the volume. Typically, a single value is stored in the array 202 for each voxel 102, although it is also possible to store more than one value for each voxel 102, such as for color, illumination, opacity, velocity and amplitude.
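The storage scheme described above can be sketched briefly. The dimensions and values below are assumptions; the point is that a voxel's position is implicit in its index in the flat array:

```python
import numpy as np

# Voxel data values stored in a flat array, position implicit in the index.
ni, nj, nk = 4, 4, 4                              # volume dimensions (assumed)
flat = np.arange(ni * nj * nk, dtype=np.float32)  # stand-in for stored values

def voxel_value(flat, i, j, k):
    """Recover the value of voxel (i, j, k) from its flat-array position."""
    return flat[(i * nj + j) * nk + k]

# The same data viewed as a 3D array gives identical values.
volume = flat.reshape(ni, nj, nk)
assert voxel_value(flat, 2, 1, 3) == volume[2, 1, 3]
```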
Figs. 3, 4, and 5 show 2D texture rendering subdividing volumetric data into slices. 2D texture rendering organizes the slices into three sets of slices 302, 402 and 502 along three different orthogonal axes. The voxels are partitioned among the sets of slices 302, 402 and 502 and into cells containing multiple voxels in each slice. The partitioning is done based on the position of the voxels in array 202. In a similar fashion, 3D texture rendering typically slices the volume perpendicular to the viewing direction.
The rendering is then accomplished on a slice-by-slice basis, moving from the rear-most slice 304, 404 and 504, respectively, to the front-most slice 306, 406 and 506 respectively. The set of slices that is chosen and processed is the set whose axis makes the smallest angle to the viewing direction. While a new image is rendered it is blended with the previously drawn scene, creating perception of a 3D body. 2D texture rendering organizes the slices along one of the three volume dimensions, while 3D texture rendering slices the volume perpendicular to the viewing direction, which improves image quality, but requires interpolation between the volume data points. Such interpolation is usually performed by specialized graphics hardware. Figs. 6 and 7 show how texture value, or "texel," is determined for each voxel in each slice (blocks 702 and 704). The texels are stored in a data buffer 602 (block 706). Typically, the texel value is an indication of the RGB colors (red, green & blue) to be displayed for a voxel as determined by one or more parameters dependent on the data value or values associated with the voxel and is found in a look-up table. For example, the texel data may include a value for each of the red, green, and blue (RGB) components associated with the voxel. When all of the voxels in the slice have been processed (block 704), the contents of the data buffer are downloaded into a texture memory 604 (block 708).
Fig. 8 shows a display device 802 upon which information downloaded with the texel data is displayed. Based on that information and the perspective requested by the user, the display device maps the texels onto pixels on a display screen 804 (block 710). As each slice is downloaded and rendered, the user sees the volume in the requested perspective. Each time the user changes the view, for example by using a software tool to rotate, translate or magnify the image of the volume, the process of downloading and rendering slices is repeated.
In Fig. 9 is illustrated the display of a volume that shows the outside surfaces of the volume. The interior of the volume is not seen.
In some applications, greater flexibility is achieved by using semi-transparent data. Semi-transparent data is created by adding an additional factor, alpha (α), to each voxel along with the RGB (red, green & blue) components described above. The value of alpha of a voxel determines the opacity of the voxel. Opacity is a measure of the amount a particular voxel on a slice will allow a voxel on a background slice that maps to the same pixel on a 2D display to show through. The opacity of a voxel controls how the image of the voxel is blended with the images of the voxels behind it in the view being displayed. An opacity value of 0.0 means a voxel is completely transparent; it cannot be seen and has no effect on the color of the displayed pixel on the 2D display since it is considered to be empty. A value of 1.0 means the voxel is completely opaque and may be considered solid; if it has no other voxels mapped in front of it, its texel determines the color of the displayed pixel. Intermediate opacity values correspond to intermediate levels of opacity, and the texel-defined colors of two voxels mapped to the same pixel are mixed in conventional ways to determine the color of the pixel that will be displayed.
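The conventional blending just described can be sketched in a few lines of Python; the function name and the back-to-front traversal order are illustrative, not part of the patented method:

```python
def blend_over(front_rgb, front_alpha, back_rgb):
    """Back-to-front 'over' compositing: a front voxel with opacity
    front_alpha is blended with the color already rendered behind it.
    All values are floats in [0, 1]."""
    return tuple(front_alpha * f + (1.0 - front_alpha) * b
                 for f, b in zip(front_rgb, back_rgb))

# alpha = 0.0: the front voxel is invisible, the back color shows through
# alpha = 1.0: the front voxel is opaque and fully determines the pixel
```

Rendering a stack of slices repeats this blend once per slice, rear-most first, so fully transparent voxels leave the accumulated color untouched.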
In Fig. 10 is illustrated an opacity tool, such as the one included in the GEOVIZ product from Schlumberger-GeoQuest, the assignee of the present invention. The opacity tool is used to map volumetric data, such as geophysical seismic interpretation, magnetic imaging, and ultra-sonography data, to see semi-transparent volumetric data therein. In those cases, the value of each voxel is mapped not only to a color defined by its texel but also to an opacity defined by alpha (α). In Fig. 10, the user has adjusted the opacity mapping, shown graphically by curve 1002, to make transparent all voxels (alpha = 0) except those having large positive or negative values. This has the effect of making most of the data transparent when displayed, as can be seen from the histogram 1004 that reflects the distribution of the values of the voxels in the volumetric data displayed in Fig. 9. This technique is called "standard opacity volume rendering" and allows a user to make voxels within a selected range of data values invisible, while leaving other voxels visible.
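An opacity mapping of the kind shown by curve 1002 could be sketched as follows; the normalized thresholds and the hard step are assumptions for illustration, not the tool's actual curve:

```python
def opacity_from_value(value, low=-0.6, high=0.6):
    """Illustrative opacity transfer function: voxels whose normalized
    data values fall between `low` and `high` are made fully transparent,
    while large positive or negative values stay opaque."""
    if low < value < high:
        return 0.0   # transparent: most of the data disappears
    return 1.0       # opaque: strong positive/negative amplitudes remain

# Applying this per voxel yields the "standard opacity" alpha values.
```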
In Fig. 11 is shown a display that results when the data displayed in Fig. 9 is processed using the opacity tool shown in Fig. 10. The surface of the volume no longer obscures structures inside the volume as is evident when comparing Figs. 9 and 11.
It is also apparent from the histogram 1004 and Fig. 11 that most of the opacity-adjusted voxels are transparent and have no effect on the display.

Summary of the Invention
The invention is concerned with improving the visual quality of images produced by rendering volumetric data in voxel format for the display of three-dimensional (3D) data on a two-dimensional (2D) display, with shading and opacity used to control the realistic display of images rendered from the voxels.
One of the main shortcomings of standard opacity volume rendering of the voxel format data volumes described in the Background of the Invention is that objects displayed using a voxel data volume appear flat, which inhibits depth perception and makes it hard, if not impossible, to determine the 3D shape, orientation, and relative positions of the objects.
Figure 12 shows the display of an elliptical object rendered using standard opacity volume rendering of volumetric data consisting of voxels. Although it is clear that the cross-sectional shape is oval, there is no way to tell if the object is elliptical, ovoid, conical, etc. It is also impossible to tell if the object touches the flat seismic section behind it. The reason for these shortcomings is that traditional computer graphics surface lighting techniques do not work for voxels because the individual voxels have no proper surfaces to be lit.
Figure 13 shows how an elliptical set of voxels will appear when displayed as a set of solid "bricks" lit with a light. The actual shape of the rendered object can now be seen.
In an attempt to alleviate the shortcomings with rendering and displaying voxels, the viewer makes elaborate adjustments to the volume color map or the opacity values, watches the rendering progressing from back to front, moves the viewpoint or simply employs the imagination to help recreate the objects in his or her mind. The methods are very time-consuming, subjective and difficult for an inexperienced viewer to master, and they all yield unsatisfactory results.
To overcome these shortcomings the invention illuminates a voxel volume with one or more light sources. When rendered and displayed the lighted volume offers the viewer ample visual information to aid in the perception of depth, as well as the shapes, orientations, and positions of objects in the voxel volume. Lighting parameters are computed, and graphical attributes of individual voxels are adjusted based on the orientation of opacity isosurfaces passing through each voxel in the volume.
To improve the display of objects the opacity of displayed voxels of a data volume may be adjusted depending upon the opacity of nearby voxels. Using an opacity tool each voxel in a volume is assigned a standard opacity value α using a three-dimensional opacity function α(i, j, k), where alpha represents opacity and the letters i, j, k represent orthogonal directions.
Isosurfaces connect voxels with equal opacity values. For each voxel, the algorithm estimates a normal to the isosurface passing though the voxel, which is equal to the negative gradient of the opacity function at the voxel, and uses it to shade the voxel as if it were a point on the isosurface. Though results are typically shown for a single unidirectional white light source, the results can be easily extended to bi-directional light sources or colored lights.
In Fig. 14 are shown the basic steps implemented in the rendering process that includes the teaching of the present invention, and these steps are generally described immediately below. The invention is implemented only in the second and third steps, in blocks 1402, 1403 and 1404. At block 1401, the initial opacity value of each voxel in the input voxel data volume is processed with an opacity tool, such as the one shown in Fig. 10, to make most of the data transparent when displayed, as previously described, so that the structures inside the volume are no longer obscured, as can be seen in Fig. 11. As part of this process the initial opacity value for each visible voxel in the data volume is converted to a standard opacity (α) value.
At block 1402, the standard opacity value α for each visible voxel is converted to a new "visible opacity" β value in accordance with the teaching of the present invention.
At block 1403, a new visible gradient value is calculated for every visible voxel in a volume using the new visible opacity β values in accordance with the teaching of the present invention. The new visible gradient value calculated for each voxel accounts for degenerate cases in numeric gradient computation and is used in all further rendering computations. Only visible cells, selected using the opacity tool and having a new visible opacity β calculated, have their opacity β values used to compute a new visible gradient, G, for each voxel. The new visible gradient, G, for each voxel is used to shade and render the displayed voxels.
At block 1404, ambient and diffuse shading is computed that would be applied to an opacity isosurface passing through the voxels in an ambient and diffuse illumination model wherein the voxel volume is illuminated with one or more light sources (typically directional and bi-directional). The direction of the negative visible gradient vector G serves as a normal to the isosurface. Special treatment is added for voxels inside opaque areas of the volume based on specifics of volumetric geoscience data, which improves a user's perception of the rendered image. A lighted and shaded volume offers the viewer ample visual information to aid in the perception of depth, as well as the shapes, orientations, and positions of objects in the volume. At block 1405, for both 2D and 3D textures, the colors of all visible voxels in the volume are modified by applying the shading computed in block 1404.
At block 1406 the processed, visible, voxel data volume, with shading, is displayed.
In one aspect of the invention, a computer is used to process voxel volume data, and the invention comprises an article of manufacture comprising a medium that is readable by the computer and the medium carries instructions for the computer to perform the novel process of calculating the new visible opacity and the visible gradient for each voxel as described in the previous paragraphs.
Brief Description of the Drawings
The invention will be better understood upon reading the following Description of the Preferred Embodiment in conjunction with the drawings in which:
Fig. 1 is a representation of voxels within a cubic volume;
Fig. 2 is a representation of the storage of voxel values in a memory;
Figs. 3-5 illustrate the partitioning of a volume into slices in orthogonal planes for 2D texture rendering;
Fig. 6 is a representation of a volume rendered image without shading;
Fig. 7 is a block diagram of the steps in a volume rendering process;

Fig. 8 is a block diagram of volume rendering equipment;

Fig. 9 is a rendering of an opaque volume;

Fig. 10 is a representation of an opacity tool;

Fig. 11 is a rendering of a semi-transparent volume after an opaque volume is processed using an opacity tool;
Fig. 12 shows the display of an elliptical object rendered using standard opacity volume rendering of volumetric data consisting of voxels without shading;

Fig. 13 shows how an elliptical set of voxels would appear when displayed as a set of solid "bricks" lit with a light;
Fig. 14 shows in block diagram form the steps of processing and displaying volumetric geoscience data consisting of a volume of voxel data;

Figs. 15 - 17 are exploded views of volumes, each showing a cell made up of voxels that is used to calculate new visual opacity and visual gradient values for voxels;
Figs. 18 - 20 are tables showing how the new visual opacity values for voxels are calculated; and
Figs. 21 - 23 are tables showing how the new visual opacity values are used to calculate new visual gradient values for voxels.
Description of the Preferred Embodiment
In the following description all reference numbers indicate in which Figure they are located. Thus, for example, reference number 1004 is found in Fig. 10, and reference number 1405 is found in Fig. 14.
The volumetric data volume is first partitioned into slices as illustrated in Figs. 3, 4 and 5, and the slices each contain rows and columns of voxels, not seen in these Figures but seen in Figures 15 - 17. Volume rendering of the data may be sped up by reducing the number of voxels that are processed, downloaded and rendered, by eliminating transparent voxels. Such volume rendering is taught in the previously mentioned U.S. Patent No. 6,304,266 that is incorporated herein by reference.
In Figs. 15 - 17 are shown a representative volume of voxels formed into a plurality of slices A - E and cells. As particularly applied to geoscience data, voxels are grouped into cells containing no fewer than twenty-seven alphanumerically designated voxels (e.g. A11, C21) and forming a cube. Other undesignated voxels shown in these Figures are other voxels of the data volume in which the cells being described are located. Only one cell is shown in each of these Figures for simplicity, but in reality there would be many slices and many cells in each slice of a volume. These cells are described in greater detail hereinafter. The slices are shown separated from each other only for ease in seeing the twenty-seven voxels A11-A33, B11-B33 and C11-C33 that make up cells one and three in Figs. 15 and 17 respectively, and the twenty-seven voxels B11-B33, C11-C33 and D11-D33 that make up cell two in Fig. 16.
For the purposes of this example, only horizontal slices are specifically shown in Figs. 15, 16 and 17, but there are also vertical slices, and the intersecting orthogonal slices through the data volume are used to create the three-dimensional rows and columns of voxels, as shown. In this particular case, the vector / arrow direction of the light source is parallel to one of the three orthogonal axes (u, v, w) of the sliced data volume while the other two axes are orthogonal to the direction of the light source vector. In Figs. 15 and 16 the light vector is parallel to axis W, and in Fig. 17 the light vector is parallel to axis V. Note that while the illustrations show the light direction parallel to one of the axes, the method employed supports arbitrary directions for the light source. The values of initial opacity, α, for the voxels in each cell are used to calculate the values of visible opacity β only for the voxels in each cell, as described hereinafter with reference to Figs. 18 - 20.
While the preferred cell size, as shown in Figs. 15 - 17, is twenty-seven voxels forming a cube of 3x3x3 voxels, a different cell partitioning size might be appropriate for other data, such as medical or meteorological data. An example of another cell size is one hundred twenty-five voxels forming a cube of 5x5x5 voxels. The number of voxels in a cell is based on the distribution of data in the volume.
In cell one of Fig. 15 the particular voxel of interest is the center voxel B22. In Fig. 15 light is incident on the top of the data volume and cell one, and is represented by the light source vector / arrow. In cell two of Fig. 16 the particular voxel of interest is the center voxel C22. In Fig. 16 light is incident on the top of the data volume and cell two, and is represented by the light source vector / arrow. In Fig. 17 cell three is made up of the same voxels as cell one, but light is incident on the right side of the data volume and cell three, as represented by the light source vector / arrow.
As briefly described above, the initial opacity value for each voxel in a data volume is first mapped with an opacity tool, such as the one illustrated in Fig. 10 and that is included in the GEOVIZ product of Schlumberger-GeoQuest, to obtain a "standard opacity" value α for each voxel in a data volume. The opacity tool adjusts the opacity mapping of voxels in the data volume to make transparent chosen voxels (α = 0) except, for example, those having large positive or negative values. This has the effect of making most of the data transparent when rendered, and structures inside the data volume are no longer obscured as can be seen in Fig. 11. This is a technique called "standard opacity volume rendering" and allows a user to make voxels within a selected range of data values invisible, while leaving others visible. The operator does this by changing the setting of the opacity tool. This technique is described in greater detail in U.S. Patent No. 6,304,266 cited above.
In a simplified treatment of visibility we first find the largest component of the light incident on a data volume and mark it as the "visibility direction" (e.g., u, assuming that Lu = max(Lu, Lv, Lw)). This would be the vector / arrow in Figs. 15 - 17. Thus, for a light coming from the u direction (not represented in Figs. 15 - 17), the new "visible opacity" β value is computed using the following equations:
if Σj,k α(i+1, j, k) > Σj,k α(i−1, j, k):

β(i−1, j, k) = α(i−1, j, k)
β(i, j, k) = max(β(i−1, j, k), α(i, j, k))
β(i+1, j, k) = max(β(i, j, k), α(i+1, j, k))

else:

β(i+1, j, k) = α(i+1, j, k)
β(i, j, k) = max(β(i+1, j, k), α(i, j, k))
β(i−1, j, k) = max(β(i, j, k), α(i−1, j, k))
The above mathematical equations are used to calculate the visible opacity β for each voxel in each of the nine columns of three voxels in a cell, as viewed from the direction of the light, such as voxel column A11, B11, C11 and voxel column A23, B23, C23 in cell one. When a voxel column is viewed from the direction of the light source, if the first voxel is opaque all other voxels behind it cannot be seen. For example, in cell one in Fig. 15, if voxel A11 is opaque, voxels B11 and C11 behind it cannot be seen. Similarly, if voxel A11 is semi-transparent (initial opacity value between 0.0 and 1.0), the opacity value of voxels B11 and C11 behind voxel A11 cannot be any less. For example, if voxel A11 is semi-transparent with an opacity value of 0.7, voxels B11 and C11 cannot have a visible opacity value any lower than 0.7, and their values are changed accordingly in calculating visible opacity values β for voxels B11 and C11.
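The per-column rule just described (visible opacity never decreases with depth along the light direction) amounts to a running maximum, which can be sketched as follows; names are illustrative:

```python
def visible_opacity_column(alphas):
    """Propagate standard opacities alpha along one voxel column,
    ordered from the light source inward: a voxel's visible opacity
    beta can never be less than that of any voxel in front of it."""
    betas = []
    running_max = 0.0
    for a in alphas:
        running_max = max(running_max, a)
        betas.append(running_max)
    return betas

# Example from the text: A11 semi-transparent at 0.7 forces B11 and C11
# to visible opacities of at least 0.7.
# visible_opacity_column([0.7, 0.2, 0.5]) -> [0.7, 0.7, 0.7]
```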
Figs. 18 - 20 are tables used to simplify and more clearly show the mathematical calculations performed by the above equations to derive the new visible opacity values β, respectively, for all twenty-seven voxels in each of cells one, two and three using the "standard opacity" value α of the voxels in accordance with the teaching of the present invention. In the following description, "α" indicates standard opacity and when used as αA11 indicates the standard opacity of voxel A11, and "β" indicates the new visible opacity and when used as βA11 indicates the visible opacity of voxel A11. Returning to Fig. 15, the calculation of visible opacity β values for all voxels in cell one is shown in the table in Fig. 18. With light being incident on the top of the volume in Fig. 15, and the A11-A33 voxels being on the top of the cell, the standard opacity value of each of the voxels A11-A33 (αA11 - αA33) is equal to the visible opacity β value of each of these voxels. There is nothing in front of these voxels to block or alter the intensity of the light shining upon them, so their standard opacity will equal their visible opacity. This is represented in the "A slice" column in Fig. 18 as αA11 = βA11, αA12 = βA12, etc. through αA33 = βA33.
To calculate the visible opacity β of voxel B11 behind voxel A11, and with the standard opacity of voxel A11 (αA11) equal to the visible opacity βA11 of voxel A11: when βA11 is greater than the standard opacity of voxel B11 (αB11) behind it, the visible opacity of voxel B11 (βB11) cannot be any smaller and is changed to equal the visible opacity of voxel A11. That is, the visible opacity of voxel B11, βB11, is set equal to the visible opacity value of voxel A11, βA11 (βB11 = βA11). Conversely, if the visible opacity βA11 is less than or equal to the standard opacity αB11, then the visible opacity of voxel B11, βB11, is set equal to its standard opacity value (βB11 = αB11). Continuing down the same column of voxels A11, B11 and C11 to the C11 voxel, and again using the same rationale, if the visible opacity of voxel B11, βB11, is greater than the standard opacity of voxel C11, αC11, then the visible opacity of voxel C11, βC11, is set equal to the visible opacity of voxel B11, βB11 (βC11 = βB11). Conversely, if the visible opacity of voxel B11, βB11, is less than or equal to the standard opacity of voxel C11, αC11, then the visible opacity βC11 remains equal to its standard opacity (βC11 = αC11).
This same determination of visible opacity for all voxels in cell one is repeated for each of the other of the nine columns of three voxels in Fig. 15 (e.g. voxel columns A32, B32 & C32; A22, B22 & C22; etc.). The visible opacity β values calculated in this manner for all voxels in cell one are used for the sole purpose of calculating the visual opacity gradient for only voxel B22 in the center of cell one, as is described hereinafter with reference to Fig. 21. Although other cells, such as cell two, include many of the same voxels included in cell one, the values of visual opacity are recalculated for all voxels in each cell, and the visual opacity values for voxels in cell two may well be different from those values calculated when the voxels are in cell one.
Although only one cell is shown in Fig. 15 there are in reality many cells in a data volume and the visible opacity β is determined for all voxels in the data volume. This is done by having each voxel at the center of a cell and performing the calculations described above. The voxels on the outer surface of the volume can be disregarded.
In Fig. 19 is a table showing how the standard opacity α values of each of the voxels in cell two (Fig. 16) are used to obtain the new visible opacity β values for each of the voxels in cell two. With light being incident on the top of the data volume in Fig. 16, and the B11-B33 voxels of cell two being on the side of the cell from which the light is coming, the standard opacity value of each of the voxels B11-B33 is equal to the visible opacity value of each of these same voxels. This is represented in the "B slice" column in Fig. 19 as αB11 = βB11, αB12 = βB12, etc. through αB33 = βB33.
When calculating the visible opacity of voxel C11 behind voxel B11, and using the rationale described in previous paragraphs, if the visible opacity βB11 is greater than the standard opacity αC11 of voxel C11, then the visible opacity of voxel C11 is changed to equal the visible opacity of voxel B11; that is, βC11 = βB11. Conversely, if the visible opacity of voxel B11, βB11, is less than or equal to the standard opacity of voxel C11, αC11, then the visible opacity βC11 of voxel C11 is set equal to its standard opacity; that is, βC11 = αC11. Continuing down the column of voxels B11, C11 and D11, if the visible opacity of voxel C11, βC11, is greater than the standard opacity αD11 of voxel D11, then the visible opacity of voxel D11, βD11, is set equal to the visible opacity of voxel C11, βC11; that is, βD11 = βC11. Conversely, if the visible opacity of voxel C11, βC11, is less than or equal to the standard opacity of voxel D11, αD11, then the visible opacity βD11 remains equal to its standard opacity αD11; that is, βD11 = αD11.
This same calculation of visible opacity β for all voxels in cell two is repeated for each of the other nine columns of three voxels in cell two in Fig. 19 (e.g. voxel columns B32, C32 & D32; B22, C22 & D22; etc.). In this manner the visible opacity β of all voxels in cell two is calculated. The calculated values of visible opacity β for all voxels in cell two are only used in the equations in Fig. 22 to calculate the visible opacity gradient G of voxel C22 in the center of cell two.
Although only one cell is shown in Fig. 16 there are in reality many cells in a data volume and the visible opacity β is determined for all voxels in the data volume. This is done by having each voxel at the center of a cell and performing the calculations described above. The voxels on the outer surface of the volume can be disregarded.
In Fig. 17 is cell three, which has the same voxels as cell one shown in Fig. 15, but the direction of light on the data volume and cell three is from the right side, rather than from the top. Accordingly, the calculations for visible opacity β are identical in manner, but the results differ because the columns of three voxels now run sideways through cell three, for example, voxels B23, B22, B21 and voxels C13, C12, C11. The tables for calculating visible opacity β for all voxels in cell three are shown in Fig. 20. In view of the previous description of how these calculations are made with reference to the tables in Figs. 18 and 19, the description is not repeated here for the sake of brevity. The visible opacity β values of all the voxels in cell three are used to calculate the visible opacity gradient G only for voxel B22 in the center of cell three, as described hereinafter with reference to Fig. 23. Again, there are many cells in the data volume in Fig. 17 and visible opacity is determined for all voxels.

Figs. 21 - 23 are tables used to describe the mathematical calculations performed to derive the three gradient components Gu, Gv, and Gw that define a new visible opacity gradient G for only the voxel in the center of each of representative cells one, two and three in accordance with the teaching of the present invention. The gradients must be derived for all voxels in a data volume, so similar tables are derived for the other voxels in a data volume, not just those of cells one, two and three. The three gradient components are calculated for every voxel using the newly calculated values of visible opacity β for all voxels in each cell. For cell one in Fig. 15 the center voxel is B22; for cell two in Fig. 16 the center voxel is C22; and for cell three in Fig. 17 the center voxel is B22. The new visible opacity gradients G for all voxels are then used to render the voxel data volume in a manner well known in the art.
These calculations are repeated for each voxel in a volume and are described in greater detail hereinafter.
Returning to cell one, the visible opacity β values for each of the twenty-seven voxels in cell one are used to calculate the new visible opacity gradient G for center voxel B22. The visible opacity β values calculated for each of the twenty-seven voxels A11 through C33 in cell one are used in the equations shown in Fig. 21 to calculate vector components Gu, Gv and Gw of vector G only for voxel B22 in the center of cell one. The vector components are then combined to get vector G, the negative visible opacity gradient for voxel B22.
In the same manner, the new visible opacity β values calculated for each of the twenty-seven voxels B11 through D33 in cell two are used in the equations in Fig. 22 to calculate gradient components Gu, Gv and Gw of the visible opacity gradient vector G only for voxel C22 in the middle of cell two. The gradient components are then combined to get the negative visible opacity gradient for voxel C22. Also, the visible opacity β values for the twenty-seven voxels A11 through C33 in cell three are used in the equations in Fig. 23 to calculate vector components Gu, Gv and Gw of the visible opacity gradient vector G only for voxel B22. The gradient components are then combined to get the negative visible opacity gradient for voxel B22. The basic gradient equations are known in the prior art and are described in more detail in the following paragraphs, but values of visible opacity β are used in the equations, rather than values of standard opacity, to derive the simplified calculations shown in Figs. 21 - 23.
A negative opacity gradient, G, at a particular voxel is determined by three partial derivatives along the three major axes, Gu, Gv, and Gw, as:
G = (−Gu, −Gv, −Gw)

where the standard gradient equations used are:
Gu = Σj,k α(i+1, j, k) − Σj,k α(i−1, j, k)

Gv = Σi,k α(i, j+1, k) − Σi,k α(i, j−1, k)

Gw = Σi,j α(i, j, k+1) − Σi,j α(i, j, k−1)

where each sum runs over the neighboring values of the two indicated indices. However, in accordance with the teaching of the invention, these three standard gradient equations for calculating vector components Gu, Gv and Gw are modified to calculate a new negative "visible opacity gradient" vector G, shown in the following equations, by using the new visible opacity β values rather than the standard opacity values α shown in the equations immediately above. Substituting the value of visible opacity β in the standard gradient equations, the gradient equations then become:
Gu = Σj,k β(i+1, j, k) − Σj,k β(i−1, j, k)

Gv = Σi,k β(i, j+1, k) − Σi,k β(i, j−1, k)

Gw = Σi,j β(i, j, k+1) − Σi,j β(i, j, k−1)

where β(i, j, k) is the visible opacity of a single voxel indexed along its three major axes i, j, k.
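Assuming a 3x3x3 cell of visible opacities, the three summed central differences above can be sketched as follows; the array layout is an assumption for illustration:

```python
def visible_gradient(beta):
    """Gradient components (Gu, Gv, Gw) for the center voxel of a
    3x3x3 cell: each component is the sum of visible opacities over one
    face of the cell minus the sum over the opposite face (the
    twenty-six-neighborhood central difference described in the text).
    `beta[i][j][k]` holds visible opacities indexed along u, v, w."""
    gu = sum(beta[2][j][k] - beta[0][j][k] for j in range(3) for k in range(3))
    gv = sum(beta[i][2][k] - beta[i][0][k] for i in range(3) for k in range(3))
    gw = sum(beta[i][j][2] - beta[i][j][0] for i in range(3) for j in range(3))
    return gu, gv, gw   # the shading normal is the negative of this vector
```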
Only visible voxels, selected using the opacity tool and then further processed to derive visible opacity β for each visible voxel, are used to compute vector components Gu, Gv and Gw of the negative visible opacity gradient G, preferably using the twenty-six-neighborhood central difference method, at each visible voxel using the modified equations immediately above. The negative visible opacity gradient G is calculated using the equation:
G = (−Gu, −Gv, −Gw)

where the vector components Gu, Gv and Gw are calculated using the tables in Figs. 21 - 23. These tables reflect the calculations in the above gradient equations.
A gradient G computed using the twenty-six-neighborhood difference method has significantly more distinct values and results in smoother images than one computed using a six-neighborhood difference; the latter is faster to compute but has only twenty-seven distinct values, which results in abrupt changes of shading. For the present description we use the twenty-six-neighborhood difference gradient of the opacity. These equations work regardless of the number of lights illuminating an object or the color of the light.
The new visible opacity gradient, G, for each of voxels B22 (cell one), C22 (cell two) and B22 (cell three) is then used to calculate ambient and diffuse shading intensity for those voxels in a manner previously known, with the addition of special treatments provided by this invention, and generally described with reference to block 1404 in Fig. 14. Shading is computed as if it were applied to an opacity isosurface passing through all voxels in a volume in an ambient and diffuse illumination model wherein the voxel volume is illuminated with one or more light sources (typically directional and bi-directional). The direction of the negative visible opacity gradient vector serves in this case as a normal to the isosurface. Special treatment is added for the voxels inside opaque areas of the volume based on specifics of volumetric geoscience data, which improves a user's perception of the rendered image. A lighted and shaded volume offers the viewer ample visual information to aid in the perception of depth, as well as the shapes, orientations, and positions of objects in the volume.
As part of computing shading a decision is made whether the volume being rendered is to be shaded as if lit by a uni-directional light source pointing in one direction or by bi-directional light consisting of two identical directional lights pointing in opposite directions on a volume. When the volume being rendered is lit by a uni-directional source the ambient and diffuse shading intensity Ishading is calculated as follows:

if G · L > 0:
Ishading = Iambient + Idiffuse (G · L) / Norm(G)
else (G · L ≤ 0):
Ishading = Iambient

where G is the negative visible opacity gradient, calculated as previously described, and L is the unit light vector.
For bi-directional light consisting of two identical directional lights pointing in opposite directions on the volume, the ambient and diffuse shading intensity is calculated as follows:

if G · L > 0:
Ishading = Iambient + Idiffuse (G · L) / Norm(G)
else (G · L ≤ 0):
Ishading = Iambient − Idiffuse (G · L) / Norm(G)

where G is the vector of the negative visible opacity gradient and L is the unit light vector. The case of G = 0 (which occurs when and only when Norm(G) = 0) deserves special treatment. There exist four possible scenarios:
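Both shading formulas can be sketched in one function; the ambient and diffuse weights are illustrative defaults, not values from the patent:

```python
import math

def shading_intensity(g, light, ambient=0.2, diffuse=0.8, bidirectional=False):
    """Ambient + diffuse shading. `g` is the negative visible opacity
    gradient vector, `light` the unit light vector L."""
    norm = math.sqrt(sum(c * c for c in g))
    if norm == 0.0:
        return ambient                   # degenerate; see scenarios (a)-(d)
    dot = sum(gc * lc for gc, lc in zip(g, light))
    if dot > 0:
        return ambient + diffuse * dot / norm
    if bidirectional:                    # second light points the opposite way
        return ambient - diffuse * dot / norm
    return ambient                       # back-facing, uni-directional light
```

Under a uni-directional light a back-facing isosurface receives only the ambient term, while the bi-directional variant lights it from the opposite side instead.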
a) G = 0 and the opacity value in the voxel is 0.0 (β(i, j, k) = 0). We treat this case as empty space and set Ishading = 0.
b) G = 0 while the data value in the voxel is not 0.0 ( β(i, j,k) ≠ 0 ), data values of the surrounding voxels are not all 0.0, and
∑β(i + l,j,k) = ∑β(i,j + \,k) = ∑β(i,j,k + l)
J,k ι,k i,ι
In this case the voxel is treated as positioned within a homogeneous voxel body. Theoretically, in this case an isosurface that would pass through the voxel is not defined and, thus, the voxel should not be shaded ( Ishadin = 1 ). This produces visual artifacts when somebody observes the rendered image. In order to eliminate it, we choose to assign such voxels an artificial gradient Gprefereιl and set the shading intensity in such a voxel to be: shading ~ ambient "*~ * diffuse prefered ' )
For most 3D geoscience volumes the data contains horizontal layers, so the most reasonable choice for G_preferred is the vector shown in Figure imgf000022_0001.
c) G = 0 while the data value in the voxel is not 0.0 (β(i,j,k) ≠ 0), but the data values of all the surrounding voxels are 0.0. In this case of a single voxel surrounded by empty space, the direction of the visible opacity gradient is not defined. Thus, we arbitrarily select it to satisfy the equations from scenario (b) immediately above. d) The rest of the scenarios might require re-computing G by using right differences or some other known method. The required computations are time consuming; for most voxels, the shading computed using the formula in scenario (b) above approximates the precisely computed values reasonably well.
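The G = 0 scenarios can be dispatched as in the following sketch; taking G_PREFERRED as a vertical unit vector, along with the intensity constants, is an assumption made for illustration:

```python
import numpy as np

# Assumed artificial gradient for voxels inside homogeneous bodies; a vertical
# unit vector is a plausible choice for horizontally layered geoscience data.
G_PREFERRED = np.array([0.0, 0.0, 1.0])

def shade_zero_gradient(beta, L, I_ambient=0.3, I_diffuse=0.7):
    """Shading intensity for a voxel whose visible opacity gradient is zero.
    beta is the voxel's opacity value; L is the unit light vector."""
    if beta == 0.0:
        return 0.0  # scenario (a): empty space is left unlit
    # scenarios (b)/(c): homogeneous body or isolated voxel -> use G_PREFERRED
    return I_ambient + I_diffuse * float(np.dot(G_PREFERRED, L))
```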
After the shading intensity has been computed, including the special treatment in scenarios (a)-(d), it is applied to a color-coded opacity volume, as referred to with reference to block 1405 in Fig. 14. There are multiple known ways to implement this. Often, the color of a voxel is derived from the associated data value, μ, using a color palette, also called a look-up table or LUT, as:

(r, g, b, a) = LUT(μ)

where LUT is essentially a one-dimensional array of (r, g, b, a) quadruples indexed by the data value, μ. Thus, whenever a color palette is used, the color of a voxel is a function of only the data value.
When we apply shading to a voxel, the final color of the voxel depends not only on the data value, but also on the shading intensity associated with the voxel. The initial formula calls for multiplying each of the color components of each voxel by the shading intensity for the voxel. This requires many additional computations, slowing the process. One could alternatively use an extended color palette, SHADING_LUT, a two-dimensional matrix composed of (r, g, b, a) columns computed for different values of the shading intensity, I_shading. Once pre-computed, such an extended color palette can be used to look up the color of a shaded voxel using the data value, μ, and shading intensity, I_shading, as indexes:

(r, g, b, a) = SHADING_LUT(μ, I_shading)

The initial coloring (r, g, b) of a 3D seismic opacity volume does not depend on the volume opacity and can be specified based on any data values associated with the volume (e.g., amplitude, instantaneous frequency, etc.). In order to simulate illumination of a voxel by white light, the color components of the voxel material are modified by multiplying them by the shading intensity:
(r_shaded, g_shaded, b_shaded, a) = (I_shading · r, I_shading · g, I_shading · b, a)
The results obtained from this calculation are the rendered volumetric data that is used to display the 3D seismic information volume on a 2D display device.
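A sketch of the extended-palette approach follows, assuming a 256-entry base LUT and a discretized set of shading-intensity levels (both sizes, and the function names, are illustrative):

```python
import numpy as np

def build_shading_lut(lut, n_levels=64):
    """Precompute SHADING_LUT: scale the (r, g, b) of each base-palette entry
    by each discretized shading intensity; alpha is left unmodified.
    lut: (256, 4) array of (r, g, b, a). Returns shape (n_levels, 256, 4)."""
    levels = np.linspace(0.0, 1.0, n_levels)[:, None, None]
    shaded = lut[None, :, :] * levels
    shaded[:, :, 3] = lut[:, 3]  # restore the unscaled alpha channel
    return shaded

def shaded_color(shading_lut, mu, i_shading):
    """Look up the final voxel color by data value mu and shading intensity."""
    n_levels = shading_lut.shape[0]
    level = min(int(round(i_shading * (n_levels - 1))), n_levels - 1)
    return shading_lut[level, mu]
```

Two table lookups then replace three multiplications per voxel, at the cost of memory proportional to the number of intensity levels.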
While what has been described hereinabove is the preferred embodiment of the invention, it will be appreciated by those skilled in the art that numerous changes may be made without departing from the spirit and scope of the invention.
Claims

What is claimed is:
1. A method for rendering a volume of voxel data with shading and opacity, wherein each voxel comprises a value representative of a parameter at a location within the volume and each voxel has an initial value of opacity, the method comprising the steps of: calculating a revised value of opacity for each of the voxels in the volume, the revised value of opacity of a voxel being dependent upon its initial value of opacity and the revised value of opacity of voxels proximate thereto; and calculating an opacity gradient for each of the voxels in the volume using the calculated revised values of opacity for all voxels.
2. The volume rendering method of claim 1 wherein the step of calculating a revised value of opacity further comprises the step of: creating a cell of voxels surrounding each voxel in the volume, wherein all voxels in each cell are arranged into groups so that each voxel in each group of voxels within each cell are positioned one behind the other in a line parallel to the primary direction of light.
3. The volume rendering method of claim 2 wherein the step of calculating a revised value of opacity further comprises the steps of: setting the revised value of opacity of the voxel closest to the source of light in each group of voxels in each cell equal to its initial value of opacity; and setting the revised value of opacity of all other voxels in each group of voxels in each cell equal to the revised value of opacity of an adjacent voxel in the same group of voxels that is closer to the source of light if the revised value of opacity of the closer voxel is equal to or greater than the initial value of opacity of the adjacent other voxel, and setting the revised value of opacity of the adjacent other voxel equal to its initial value of opacity if the revised value of opacity of the closer voxel is less than the initial value of opacity of the adjacent other voxel.
4. The volume rendering method of claim 3 wherein the step of calculating an opacity gradient for each of the voxels in the volume using the calculated revised values of opacity for all voxels further comprises the steps of: combining the revised values of opacity for the voxels in each cell to derive three orthogonal opacity gradient components for the voxel in the center of each cell; and combining the three orthogonal opacity gradient components for the voxel in the center of each cell to derive an opacity gradient that is normal to an isosurface passing through the voxel in the center of each cell.
5. The volume rendering method of claim 1 wherein the step of calculating an opacity gradient for each of the voxels in the volume using the calculated revised values of opacity for all voxels further comprises the steps of: combining the revised values of opacity for the voxels in each cell to derive three orthogonal opacity gradient components for the voxel in the center of each cell; and combining the three orthogonal opacity gradient components for the voxel in the center of each cell to derive an opacity gradient that is normal to an isosurface passing through the voxel in the center of each cell.
6. The volume rendering method of claim 4 further comprising the steps of: calculating shading for the volume using the opacity gradient; and displaying the rendered volume on the display device.
7. The volume rendering method of claim 1 further comprising the steps of: calculating shading for the volume using the opacity gradient; and displaying the rendered volume on the display device.
8. A method for rendering a volume of voxel data with shading and opacity dependent upon the direction of a source of light specified to be illuminating the volume for rendering, wherein each voxel comprises a value representative of a parameter at a location within the volume and each voxel has an initial value of opacity, the method comprising the steps of: (1) calculating a revised value of opacity for a first voxel in the volume by creating a cell of voxels surrounding the first voxel, wherein all voxels in the cell are arranged into groups so that each voxel in each group of voxels within the cell are positioned one behind the other in a line parallel to the primary direction of light, the revised opacity calculation step further comprising the steps of: (a) setting the revised value of opacity of the voxel closest to the source of light in each group of voxels in the cell equal to its initial value of opacity; and (b) setting the revised value of opacity of other voxels in the groups of voxels in the cell equal to the revised value of opacity of an adjacent voxel in the same group of voxels that is closer to the source of light if the revised value of opacity of the closer voxel is equal to or greater than the initial value of opacity of the other voxel, and setting the revised value of opacity of the other voxel equal to the initial value of opacity of the other voxel if the revised value of opacity of the closer voxel is less than the initial value of opacity of the other voxel; (2) repeating the revised opacity calculations step for all other voxels in the volume other than the first voxel; and (3) calculating a gradient of opacity for each of the voxels in the volume using the calculated revised values of opacity for all voxels.
9. The volume rendering method of claim 8 wherein the step of calculating an opacity gradient for each of the voxels in the volume using the calculated revised values of opacity for all voxels further comprises the steps of: combining the revised values of opacity for the voxels in each cell to derive three orthogonal opacity gradient components for the voxel in the center of the last mentioned cell; and combining the three orthogonal opacity gradient components for the voxel in the center of each cell to derive an opacity gradient that is normal to an isosurface passing through the voxel in the center of the last mentioned cell.
10. The volume rendering method of claim 8 further comprising the steps of: calculating shading for the volume using the opacity gradient; and displaying the rendered volume on the display device.
11. The volume rendering method of claim 9 further comprising the steps of: calculating shading for the volume using the opacity gradient; and displaying the rendered volume on the display device.
12. A method for rendering a volume of voxel data with shading and opacity dependent upon the direction of a source of light specified to be illuminating the volume for rendering, wherein each voxel comprises a value representative of a parameter at a location within the volume and each voxel has an initial value of opacity, the method comprising the steps of: (1) calculating a revised value of opacity for a first voxel in the volume by creating a cell of voxels surrounding the first voxel, wherein all voxels in the cell are arranged into groups so that each voxel in each group of voxels within the cell are positioned one behind the other in a line parallel to the primary direction of light, the revised opacity calculation step further comprising the steps of: (a) setting the revised value of opacity of the voxel closest to the source of light in each group of voxels in the cell equal to its initial value of opacity; (b) setting the revised value of opacity of other voxels in the groups of voxels in the cell equal to the revised value of opacity of an adjacent voxel in the same group of voxels that is closer to the source of light if the revised value of opacity of the closer voxel is equal to or greater than the initial value of opacity of the other voxel, and setting the revised value of opacity of the other voxel equal to the initial value of opacity of the other voxel if the revised value of opacity of the closer voxel is less than the initial value of opacity of the other voxel; (2) repeating the revised opacity calculations step for all other voxels in the volume other than the first voxel; (3) calculating three orthogonal opacity gradient components for the first voxel, one of the orthogonal opacity gradient components being parallel to the primary direction of light, the opacity gradient calculation step further comprising the steps of: (a) combining the revised values of opacity for the voxels in the cell to derive the three orthogonal opacity gradient 
components for the first voxel; and (b) combining the three orthogonal opacity gradient components for the first voxel to derive an opacity gradient that is normal to an isosurface passing through the first voxel; (c) repeating the opacity gradient calculation step for all other voxels in the volume other than the first voxel.
13. The volume rendering method of claim 12 further comprising the steps of: calculating shading for the volume using the opacity gradient; and displaying the rendered volume on the display device.
14. Apparatus for rendering a volume of voxel data with shading and opacity, wherein each voxel comprises a value representative of a parameter at a location within the volume and each voxel has an initial value of opacity, the rendering apparatus comprising: means for calculating a revised value of opacity for each of the voxels in the volume, the revised value of opacity of a voxel being dependent upon its initial value of opacity and the revised value of opacity of voxels proximate thereto; and means for calculating an opacity gradient for each of the voxels in the volume using the calculated revised values of opacity for all voxels.
15. The volume rendering apparatus of claim 14 wherein the means for calculating a revised value of opacity further comprises: means for creating a cell of voxels surrounding each voxel in the volume, wherein all voxels in each cell are arranged into groups so that each voxel in each group of voxels within each cell are positioned one behind the other in a line parallel to the primary direction of light.
16. The volume rendering apparatus of claim 15 wherein the means for calculating a revised value of opacity further comprises: means for setting the revised value of opacity of the voxel closest to the source of light in each group of voxels in each cell equal to its initial value of opacity; and means for setting the revised value of opacity of all other voxels in each group of voxels in each cell equal to the revised value of opacity of an adjacent voxel in the same group of voxels that is closer to the source of light if the revised value of opacity of the closer voxel is equal to or greater than the initial value of opacity of the adjacent other voxel, and setting the revised value of opacity of the adjacent other voxel equal to its initial value of opacity if the revised value of opacity of the closer voxel is less than the initial value of opacity of the adjacent other voxel.
17. The volume rendering apparatus of claim 16 wherein the means for calculating an opacity gradient for each of the voxels in the volume using the calculated revised values of opacity for all voxels further comprises: means for combining the revised values of opacity for the voxels in each cell to derive three orthogonal opacity gradient components for the voxel in the center of each cell; and means for combining the three orthogonal opacity gradient components for the voxel in the center of each cell to derive an opacity gradient that is normal to an isosurface passing through the voxel in the center of each cell.
18. The volume rendering apparatus of claim 14 wherein the means for calculating an opacity gradient for each of the voxels in the volume using the calculated revised values of opacity for all voxels further comprises: means for combining the revised values of opacity for the voxels in each cell to derive three orthogonal opacity gradient components for the voxel in the center of each cell; and means for combining the three orthogonal opacity gradient components for the voxel in the center of each cell to derive an opacity gradient that is normal to an isosurface passing through the voxel in the center of each cell.
19. The volume rendering apparatus of claim 17 further comprising: means for calculating shading for the volume using the opacity gradient; and means for displaying the rendered volume on a display device.
20. The volume rendering apparatus of claim 14 further comprising: means for calculating shading for the volume using the opacity gradient; and means for displaying the rendered volume on a display device.
21. Apparatus for rendering a volume of voxel data with shading and opacity dependent upon the direction of a source of light specified to be illuminating the volume for rendering, wherein each voxel comprises a value representative of a parameter at a location within the volume and each voxel has an initial value of opacity, the apparatus comprising: (1) means for calculating a revised value of opacity for a first voxel in the volume by creating a cell of voxels surrounding the first voxel, wherein all voxels in the cell are arranged into groups so that each voxel in each group of voxels within the cell are positioned one behind the other in a line parallel to the primary direction of light, the revised opacity calculation means further comprising: (a) means for setting the revised value of opacity of the voxel closest to the source of light in each group of voxels in the cell equal to its initial value of opacity; and (b) means for setting the revised value of opacity of other voxels in the groups of voxels in the cell equal to the revised value of opacity of an adjacent voxel in the same group of voxels that is closer to the source of light if the revised value of opacity of the closer voxel is equal to or greater than the initial value of opacity of the other voxel, and setting the revised value of opacity of the other voxel equal to the initial value of opacity of the other voxel if the revised value of opacity of the closer voxel is less than the initial value of opacity of the other voxel; (2) means for repeating the revised opacity calculations for all other voxels in the volume other than the first voxel; and (3) means for calculating a gradient of opacity for each of the voxels in the volume using the calculated revised values of opacity for all voxels.
22. The volume rendering apparatus of claim 21 wherein the means for calculating an opacity gradient for each of the voxels in the volume using the calculated revised values of opacity for all voxels further comprises: means for combining the revised values of opacity for the voxels in each cell to derive three orthogonal opacity gradient components for the voxel in the center of the last mentioned cell; and means for combining the three orthogonal opacity gradient components for the voxel in the center of each cell to derive an opacity gradient that is normal to an isosurface passing through the voxel in the center of the last mentioned cell.
23. The volume rendering apparatus of claim 21 further comprising: means for calculating shading for the volume using the opacity gradient; and means for displaying the rendered volume on the display device.
24. The volume rendering apparatus of claim 22 further comprising: means for calculating shading for the volume using the opacity gradient; and means for displaying the rendered volume on the display device.
25. Apparatus for rendering a volume of voxel data with shading and opacity dependent upon the direction of a source of light specified to be illuminating the volume for rendering, wherein each voxel comprises a value representative of a parameter at a location within the volume and each voxel has an initial value of opacity, the apparatus comprising: (1) means for calculating a revised value of opacity for a first voxel in the volume by creating a cell of voxels surrounding the first voxel, wherein all voxels in the cell are arranged into groups so that each voxel in each group of voxels within the cell are positioned one behind the other in a line parallel to the primary direction of light, the revised opacity calculation means further comprising: (a) means for setting the revised value of opacity of the voxel closest to the source of light in each group of voxels in the cell equal to its initial value of opacity; (b) means for setting the revised value of opacity of other voxels in the groups of voxels in the cell equal to the revised value of opacity of an adjacent voxel in the same group of voxels that is closer to the source of light if the revised value of opacity of the closer voxel is equal to or greater than the initial value of opacity of the other voxel, and setting the revised value of opacity of the other voxel equal to the initial value of opacity of the other voxel if the revised value of opacity of the closer voxel is less than the initial value of opacity of the other voxel; (2) means for repeating the revised opacity calculations for all other voxels in the volume other than the first voxel; (3) means for calculating three orthogonal opacity gradient components for the first voxel, one of the orthogonal opacity gradient components being parallel to the primary direction of light, the opacity gradient calculation means further comprising: (a) means for combining the revised values of opacity for the voxels in the cell to derive the three 
orthogonal opacity gradient components for the first voxel; (b) means for combining the three orthogonal opacity gradient components for the first voxel to derive an opacity gradient that is normal to an isosurface passing through the first voxel; and (c) means for repeating the opacity gradient calculations for all other voxels in the volume other than the first voxel; and (4) means for calculating shading for the volume using the opacity gradient.
26. The volume rendering apparatus of claim 25 further comprising: means for calculating shading for the volume using the opacity gradient; and means for displaying the rendered volume on the display device.
27. A computer readable medium containing executable instructions for rendering on a display device a volume of voxel data with shading and opacity dependent upon the direction of a source of light specified to be illuminating the volume for rendering, wherein each voxel comprises a value representative of a parameter at a location within the volume and each voxel has an initial value of opacity, the executable program instructions comprising program instructions for: (1) calculating a revised value of opacity for a first voxel in the volume by creating a cell of voxels surrounding the first voxel, wherein all voxels in the cell are arranged into groups so that each voxel in each group of voxels within the cell are positioned one behind the other in a line parallel to the primary direction of light, the revised opacity calculation further comprising: (a) setting the revised value of opacity of the voxel closest to the source of light in each group of voxels in the cell equal to its initial value of opacity; and (b) setting the revised value of opacity of other voxels in the groups of voxels in the cell equal to the revised value of opacity of an adjacent voxel in the same group of voxels that is closer to the source of light if the revised value of opacity of the closer voxel is equal to or greater than the initial value of opacity of the other voxel, and setting the revised value of opacity of the other voxel equal to the initial value of opacity of the other voxel if the revised value of opacity of the closer voxel is less than the initial value of opacity of the other voxel; (2) repeating the revised opacity calculations for all other voxels in the volume other than the first voxel; (3) calculating a gradient of opacity for each of the voxels in the volume using the calculated revised values of opacity for all voxels; and (4) calculating shading for the volume using the opacity gradient.
28. A computer readable medium containing executable instructions for rendering on a display device a volume of voxel data with shading and opacity, wherein each voxel comprises a value representative of a parameter at a location within the volume and each voxel has an initial value of opacity, the method comprising the steps of: calculating a revised value of opacity for each of the voxels in the volume, the revised value of opacity of a voxel being dependent upon its initial value of opacity and the revised value of opacity of voxels proximate thereto; calculating an opacity gradient for each of the voxels in the volume using the calculated revised values of opacity for all voxels; and calculating shading for the volume using the opacity gradient.
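A minimal sketch of the revised-opacity rule recited in the claims, for a single group of voxels ordered from nearest the light source to farthest (the function name and list representation are illustrative):

```python
def revise_opacity(group):
    """Propagate opacity along a group of voxels toward the light: the first
    voxel keeps its initial opacity; each later voxel takes the revised
    opacity of its closer neighbor when that is equal to or greater than its
    own initial opacity, and otherwise keeps its initial opacity. The result
    is a running maximum of opacity along the light direction."""
    revised = []
    for alpha in group:
        if revised and revised[-1] >= alpha:
            revised.append(revised[-1])
        else:
            revised.append(alpha)
    return revised
```

Applied to each group in each cell, this makes voxels hidden behind opaque material appear opaque themselves, so the gradient computed from the revised values reflects visible rather than raw opacity.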
PCT/US2001/048439 2000-12-18 2001-12-14 Method and apparatus for visualization of 3d voxel data using lit opacity volumes with shading WO2002050779A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
GB0314041A GB2386811B (en) 2000-12-18 2001-12-14 Method and apparatus for visualization of 3D voxel data using lit opacity volumes with shading
AU2002236628A AU2002236628A1 (en) 2000-12-18 2001-12-14 Method and apparatus for visualization of 3d voxel data using lit opacity volumes with shading
CA002432090A CA2432090C (en) 2000-12-18 2001-12-14 Method and apparatus for visualization of 3d voxel data using lit opacity volumes with shading
NO20023903A NO324124B1 (en) 2000-12-18 2002-08-16 Method and apparatus for visualizing three-dimensional voxel data using illuminated opacity volumes with shading

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US25643300P 2000-12-18 2000-12-18
US60/256,433 2000-12-18

Publications (1)

Publication Number Publication Date
WO2002050779A1 true WO2002050779A1 (en) 2002-06-27

Family

ID=22972213

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/048439 WO2002050779A1 (en) 2000-12-18 2001-12-14 Method and apparatus for visualization of 3d voxel data using lit opacity volumes with shading

Country Status (7)

Country Link
US (1) US6940507B2 (en)
AU (1) AU2002236628A1 (en)
CA (1) CA2432090C (en)
GB (1) GB2386811B (en)
NO (1) NO324124B1 (en)
RU (1) RU2298831C9 (en)
WO (1) WO2002050779A1 (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5381518A (en) * 1986-04-14 1995-01-10 Pixar Method and apparatus for imaging volume data using voxel values
US5442733A (en) * 1992-03-20 1995-08-15 The Research Foundation Of State University Of New York Method and apparatus for generating realistic images using a discrete representation
US5807488A (en) * 1997-02-19 1998-09-15 Metafix Inc. Exchangeable filter medium container and method of connecting and recycling such containers
US5831623A (en) * 1995-08-09 1998-11-03 Mitsubishi Denki Kabushiki Kaisha Volume rendering apparatus and method
US5847711A (en) * 1994-09-06 1998-12-08 The Research Foundation Of State University Of New York Apparatus and method for parallel and perspective real-time volume visualization
US5963211A (en) * 1995-06-29 1999-10-05 Hitachi, Ltd. Method and apparatus for directly generating three-dimensional images from voxel data with dividing image generating processes and utilizing parallel processes
US6008813A (en) * 1997-08-01 1999-12-28 Mitsubishi Electric Information Technology Center America, Inc. (Ita) Real-time PC based volume rendering system
US6040835A (en) * 1997-11-06 2000-03-21 Mitsubishi Electric Information Technology Center America, Inc. (Ita) System for depicting surfaces using volumetric distance maps

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4835712A (en) 1986-04-14 1989-05-30 Pixar Methods and apparatus for imaging volume data with shading
JP2627607B2 (en) * 1993-06-16 1997-07-09 日本アイ・ビー・エム株式会社 Volume rendering method
JP3537594B2 (en) 1996-06-13 2004-06-14 アロカ株式会社 Ultrasonic diagnostic equipment
US5986662A (en) 1996-10-16 1999-11-16 Vital Images, Inc. Advanced diagnostic viewer employing automated protocol selection for volume-rendered imaging
US5986612A (en) * 1996-12-30 1999-11-16 General Motors Corporation Vehicle window antenna
US6278459B1 (en) * 1997-08-20 2001-08-21 Hewlett-Packard Company Opacity-weighted color interpolation for volume sampling
US6130671A (en) 1997-11-26 2000-10-10 Vital Images, Inc. Volume rendering lighting using dot product methodology

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004047028A1 (en) * 2002-11-15 2004-06-03 Siemens Aktiengesellschaft Method for depicting an object displayed in a volume data set
US7457816B2 (en) 2002-11-15 2008-11-25 Siemens Aktiengesellschaft Method for depicting an object displayed in a volume data set
US7515743B2 (en) * 2004-01-08 2009-04-07 Siemens Medical Solutions Usa, Inc. System and method for filtering a medical image
RU2464640C2 (en) * 2006-10-27 2012-10-20 Эрбюс Операсьон (Сас) Method and apparatus for providing simulation of three-dimensional objects

Also Published As

Publication number Publication date
RU2298831C2 (en) 2007-05-10
GB0314041D0 (en) 2003-07-23
NO324124B1 (en) 2007-08-27
NO20023903D0 (en) 2002-08-16
NO20023903L (en) 2002-10-18
GB2386811B (en) 2005-06-08
AU2002236628A1 (en) 2002-07-01
US20020109684A1 (en) 2002-08-15
CA2432090C (en) 2009-02-17
US6940507B2 (en) 2005-09-06
RU2298831C9 (en) 2007-08-10
CA2432090A1 (en) 2002-06-27
GB2386811A (en) 2003-09-24

Similar Documents

Publication Publication Date Title
CA2432090C (en) Method and apparatus for visualization of 3d voxel data using lit opacity volumes with shading
US5307450A (en) Z-subdivision for improved texture mapping
US6304266B1 (en) Method and apparatus for volume rendering
Wilhelms et al. A coherent projection approach for direct volume rendering
CA2933764C (en) System and method for real-time co-rendering of multiple attributes
US6191794B1 (en) Method and apparatus for scaling texture maps for graphical images
US7245300B2 (en) Architecture for real-time texture look-up&#39;s for volume rendering
EP0180296A2 (en) System and method for displaying and interactively excavating and examining a three dimensional volume
CN105096385B (en) A kind of two-dimension earthquake section 3 D displaying method
US20060203010A1 (en) Real-time rendering of embedded transparent geometry in volumes on commodity graphics processing units
CN102903141B (en) Many seismic properties based on opacity weighting merge texture mapping object plotting method
Goodsell et al. Rendering volumetric data in molecular systems
Chen et al. Manipulation, display, and analysis of three-dimensional biological images
US8797383B2 (en) Method for stereoscopic illustration
WO2002007088A2 (en) Apparatus and method for diffuse and specular lighting of voxel data
Plate et al. A flexible multi-volume shader framework for arbitrarily intersecting multi-resolution datasets
JP2003263651A (en) Volume rendering method and its program
US5793372A (en) Methods and apparatus for rapidly rendering photo-realistic surfaces on 3-dimensional wire frames automatically using user defined points
Chen et al. Representation, display, and manipulation of 3D digital scenes and their medical applications
US20070188492A1 (en) Architecture for real-time texture look-up&#39;s for volume rendering
Wan et al. Interactive stereoscopic rendering of volumetric environments
Razdan et al. Volume visualization of multicolor laser confocal microscope data
Leith Computer visualization of volume data in electron tomography
Dolgovesov et al. Real-Time Volume Rendering Systems.
Chen et al. Software And Hardware For 3-D Gray Level Image Analysis And Quantization

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

ENP Entry into the national phase

Ref document number: 0314041

Country of ref document: GB

Kind code of ref document: A

Free format text: PCT FILING DATE = 20011214

Format of ref document f/p: F

121 Ep: The EPO has been informed by WIPO that EP was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2432090

Country of ref document: CA

ENP Entry into the national phase

Country of ref document: RU

Kind code of ref document: A

Format of ref document f/p: F

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP