WO2002050779A1 - Method and apparatus for visualization of 3d voxel data using lit opacity volumes with shading - Google Patents
- Publication number
- WO2002050779A1 (PCT/US2001/048439)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- opacity
- voxel
- voxels
- volume
- revised
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01V—GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
- G01V1/00—Seismology; Seismic or acoustic prospecting or detecting
- G01V1/28—Processing seismic data, e.g. analysis, for interpretation, for correction
- G01V1/34—Displaying seismic recordings or visualisation of seismic data or attributes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/08—Volume rendering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
- G06T15/503—Blending, e.g. for anti-aliasing
Definitions
- This invention relates generally to the field of computer graphics. Particularly, this invention relates to volume rendering. More particularly, this invention relates to the display of three-dimensional (3D) data on a two-dimensional (2D) display with shading and opacity to control the realistic display of volumetric data in voxel format.
- Volume rendering is an important area of computer graphics. It is employed in a wide variety of disciplines, including medicine, geology, biology and meteorology. Volume rendering allows a user to look inside an object and see features that were otherwise shielded by the rendering of the surface features of the object.
- One patent teaching volume rendering using voxels is U.S. Patent No. 6,304,266, issued October 16, 2001, entitled "Method and Apparatus For Volume Rendering," which is incorporated herein by reference.
- volumetric data is shown as consisting of a three-dimensional (3D) dataset of elements called "voxels" 102.
- the voxels 102 are uniformly distributed throughout a volume 104.
- Each voxel 102 has a position in the volume and has associated with it information such as color, illumination, opacity, velocity, amplitude, etc.
- the information associated with each voxel 102 is produced by such disciplines as medicine (e.g., CAT scans), biology (confocal microscopy), and geoscience (seismic data).
- In Fig. 2 is shown how data values of voxels 102 are typically stored in a storage array 202.
- the position of a particular voxel in the volume is inherent in its location in the array.
- array position 204 might be associated with a point 106 (Fig. 1) in the volume that is a specified distance from a specified corner of the volume.
- a single value is stored in the array 202 for each voxel 102, although it is also possible to store more than one value for each voxel 102, such as for color, illumination, opacity, velocity and amplitude.
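Since a voxel's position is implicit in its array location, a renderer typically computes a flat index from (i, j, k) coordinates. A minimal sketch, assuming a row-major layout and illustrative dimensions (neither is specified by the patent):

```python
# Sketch (not from the patent): a voxel's (i, j, k) position maps to a
# flat array index, so position is implicit in storage order.
NI, NJ, NK = 4, 4, 4          # hypothetical volume dimensions

def voxel_index(i, j, k, nj=NJ, nk=NK):
    """Row-major index of voxel (i, j, k) in a flat storage array."""
    return (i * nj + j) * nk + k

# One value per voxel; storing several values (color, opacity, velocity,
# amplitude) would use parallel arrays or a wider per-voxel stride.
volume = [0.0] * (NI * NJ * NK)
volume[voxel_index(1, 2, 3)] = 0.7   # store a data value for voxel (1, 2, 3)
```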
- Figs. 3, 4, and 5 show 2D texture rendering subdividing volumetric data into slices.
- 2D texture rendering organizes the slices into three sets of slices 302, 402 and 502 along three different orthogonal axes.
- the voxels are partitioned among the sets of slices 302, 402 and 502 and into cells containing multiple voxels in each slice. The partitioning is done based on the position of the voxels in array 202.
- 3D texture rendering typically slices the volume perpendicular to the viewing direction.
- the rendering is then accomplished on a slice-by-slice basis, moving from the rear-most slice 304, 404 and 504, respectively, to the front-most slice 306, 406 and 506 respectively.
- the set of slices that is chosen and processed is the set whose axis makes the smallest angle to the viewing direction. While a new image is rendered it is blended with the previously drawn scene, creating perception of a 3D body.
- 2D texture rendering organizes the slices along one of the three volume dimensions, while 3D texture rendering slices the volume perpendicular to the viewing direction, which improves image quality but requires interpolation between the volume data points. Such interpolation is usually performed by specialized graphics hardware.
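The slice-set selection and back-to-front blending described above can be sketched as follows; the "over" blend used here is a common convention and an assumption, not a formula from the patent:

```python
import math

def pick_slice_axis(view_dir):
    """Pick the volume axis making the smallest angle with the view
    direction (i.e. the largest |cosine|), as 2D texture rendering does."""
    axes = {'x': (1, 0, 0), 'y': (0, 1, 0), 'z': (0, 0, 1)}
    norm = math.sqrt(sum(c * c for c in view_dir))
    def cos_to(axis):
        return abs(sum(v * a for v, a in zip(view_dir, axis))) / norm
    return max(axes, key=lambda k: cos_to(axes[k]))

def blend_over(front_rgb, front_alpha, back_rgb):
    """Blend one rendered slice over the previously drawn scene,
    moving from the rear-most slice to the front-most slice."""
    return tuple(front_alpha * f + (1.0 - front_alpha) * b
                 for f, b in zip(front_rgb, back_rgb))
```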
- the texels are stored in a data buffer 602 (block 706).
- the texel value is an indication of the RGB colors (red, green & blue) to be displayed for a voxel as determined by one or more parameters dependent on the data value or values associated with the voxel and is found in a look-up table.
- the texel data may include a value for each of the red, green, and blue (RGB) components associated with the voxel.
- Fig. 8 shows a display device 802 upon which information downloaded with the texel data is displayed. Based on that information and the perspective requested by the user, the display device maps the texels onto pixels on a display screen 804 (block 710). As each slice is downloaded and rendered, the user sees the volume in the requested perspective. Each time the user changes the view, for example by using a software tool to rotate, translate or magnify the image of the volume, the process of downloading and rendering slices is repeated.
- In Fig. 9 is illustrated the display of a volume that shows the outside surfaces of the volume; the interior of the volume is not seen.
- semi-transparent data is created by adding an additional factor, alpha ( ⁇ ), to each voxel along with the RGB (red, green & blue) components described above.
- the value of alpha of a voxel determines the opacity of the voxel.
- Opacity is a measure of the amount a particular voxel on a slice will allow a voxel on a background slice that maps to the same pixel on a 2D display to show through.
- the opacity of a voxel controls how the image of the voxel is blended with the images of the voxels behind it in the view being displayed.
- An opacity value of 0.0 means a voxel is completely transparent and cannot be seen so has no effect on the color of the displayed pixel on the 2D display since it is considered to be empty; and a value of 1.0 means the voxel is completely opaque, may be considered solid and, if it has no other voxels mapped in front of it, its texel determines the color of the displayed pixel.
- Intermediate opacity values correspond to intermediate levels of opacity, and the texel defined colors of two voxels mapped to the same pixel are mixed in conventional ways to determine the color of the pixel that will be displayed.
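One conventional way to mix the texel-defined colors of voxels mapped to the same pixel is front-to-back compositing; this sketch uses a scalar color for brevity and is an assumption about the "conventional ways" the text mentions:

```python
def composite_ray(samples):
    """Front-to-back compositing of (color, alpha) samples mapped to one
    pixel; alpha 0.0 = fully transparent, 1.0 = fully opaque, matching
    the opacity semantics described above."""
    color, transmitted = 0.0, 1.0   # scalar 'color' keeps the sketch short
    for c, a in samples:
        color += transmitted * a * c
        transmitted *= (1.0 - a)
        if transmitted < 1e-4:      # early exit behind an opaque voxel
            break
    return color
```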
- In Fig. 10 is illustrated an opacity tool, such as the one included in the GEOVIZ product from Schlumberger-GeoQuest, the assignee of the present invention.
- The opacity tool is used to map volumetric data, such as geophysical seismic interpretation, magnetic imaging, and ultra-sonography data, to see semi-transparent volumetric data therein.
- The value of each voxel is not only mapped to a color defined by its texel but also to an opacity defined by alpha (α).
- In Fig. 11 is shown the display that results when the data displayed in Fig. 9 is processed using the opacity tool shown in Fig. 10.
- the surface of the volume no longer obscures structures inside the volume as is evident when comparing Figs. 9 and 11.
- the invention is concerned with improving the visual quality of images produced by rendering volumetric data in voxel format for the display of three-dimensional (3D) data on a two-dimensional (2D) display with shading and opacity to control the realistic display of images rendered from the voxels.
- One of the main shortcomings of standard opacity volume rendering of the voxel format data volumes described in the Background of the Invention is that objects displayed using a voxel data volume appear flat, which inhibits depth perception and makes it hard, if not impossible, to determine the 3D shape, orientation, and relative positions of the objects.
- Figure 12 shows the display of an elliptical object rendered using standard opacity volume rendering of volumetric data consisting of voxels.
- Figure 13 shows how an elliptical set of voxels will appear when displayed as a set of solid "bricks" lit with a light. The actual shape of the rendered object can now be seen.
- the invention illuminates a voxel volume with one or more light sources.
- the lighted volume offers the viewer ample visual information to aid in the perception of depth, as well as the shapes, orientations, and positions of objects in the voxel volume.
- Lighting parameters are computed, and graphical attributes of individual voxels are adjusted based on the orientation of opacity isosurfaces passing through each voxel in the volume.
- the opacity of displayed voxels of a data volume may be adjusted depending upon the opacity of nearby voxels.
- each voxel in a volume is assigned a standard opacity value α using a three-dimensional opacity function α(i,j,k), where α represents opacity and the letters i, j, k represent orthogonal directions.
- Isosurfaces connect voxels with equal opacity values. For each voxel, the algorithm estimates a normal to the isosurface passing through the voxel, which is equal to the negative gradient of the opacity function at the voxel, and uses it to shade the voxel as if it were a point on the isosurface. Though results are typically shown for a single unidirectional white light source, the results can be easily extended to bi-directional light sources or colored lights.
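The normal estimate described above (the negative gradient of the opacity function at the voxel) can be sketched with central differences; the callable-based sampling interface is illustrative:

```python
def opacity_normal(alpha, i, j, k):
    """Estimate the isosurface normal at voxel (i, j, k) as the negative
    gradient of the opacity function, using central differences.
    `alpha` is any callable alpha(i, j, k) -> opacity."""
    gradient = (alpha(i + 1, j, k) - alpha(i - 1, j, k),
                alpha(i, j + 1, k) - alpha(i, j - 1, k),
                alpha(i, j, k + 1) - alpha(i, j, k - 1))
    # The negative gradient points from more-opaque toward less-opaque
    # voxels, i.e. outward from an opaque body.
    return tuple(-g for g in gradient)
```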
- In Fig. 14 are shown the basic steps implemented in the rendering process that includes the teaching of the present invention, and these steps are generally described immediately below.
- the invention is implemented only in the second and third steps, in blocks 1402, 1403 and 1404.
- the initial opacity value of each voxel in the input voxel data volume is processed with an opacity tool, such as the one shown in Fig. 10, to make most of the data transparent when displayed, as previously described, so that the structures inside the volume are no longer obscured, as can be seen in Fig. 11.
- the initial opacity value for each visible voxel in the data volume is converted to a standard opacity (α) value.
- the standard opacity value α for each visible voxel is converted to a new "visible opacity" α′ value in accordance with the teaching of the present invention.
- a new visible gradient value is calculated for every visible voxel in a volume using the new visible opacity α′ values in accordance with the teaching of the present invention.
- the new visible gradient value calculated for each voxel accounts for degenerate cases in numeric gradient computation and is used in all further rendering computations. Only visible cells, selected using the opacity tool and having a new visible opacity α′ calculated, have their opacity value used to compute a new visible gradient, G, for each voxel.
- the new visible gradient, G, for each voxel is then used to shade and render the displayed voxels.
- ambient and diffuse shading is computed that would be applied to an opacity isosurface passing through the voxels in an ambient and diffuse illumination model wherein the voxel volume is illuminated with one or more light sources (typically directional and bi-directional).
- the direction of the negative visible gradient vector G serves as a normal to the isosurface.
- Special treatment is added for voxels inside opaque areas of the volume based on specifics of volumetric geoscience data, which improves a user's perception of the rendered image.
- a lighted and shaded volume offers the viewer ample visual information to aid in the perception of depth, as well as the shapes, orientations, and positions of objects in the volume.
- the colors of all visible voxels in the volume are modified by applying the shading computed in block 1404.
- a computer is used to process voxel volume data, and the invention comprises an article of manufacture comprising a medium that is readable by the computer and the medium carries instructions for the computer to perform the novel process of calculating the new visible opacity and the visible gradient for each voxel as described in the previous paragraphs.
- Fig. 1 is a representation of voxels within a cubic volume
- Fig. 2 is a representation of the storage of voxel values in a memory
- Figs. 3-5 illustrate the partitioning of a volume into slices in orthogonal planes for 2D texture rendering
- Fig. 6 is a representation of a volume rendered image without shading
- Fig. 7 is a block diagram of the steps in a volume rendering process
- Fig. 8 is a block diagram of volume rendering equipment
- Fig. 9 is a rendering of an opaque volume
- Fig. 10 is a representation of an opacity tool
- Fig. 11 is a rendering of a semi-transparent volume after an opaque volume is processed using an opacity tool
- Fig. 12 shows the display of an elliptical object rendered using standard opacity volume rendering of volumetric data consisting of voxels without shading
- Fig. 13 shows how an elliptical set of voxels would appear when displayed as a set of solid "bricks" lit with a light;
- Fig. 14 shows in block diagram form the steps of processing and displaying volumetric geoscience data consisting of a volume of voxel data;
- Figs. 15 - 17 are exploded views of volumes each showing a cell made up of voxels that is used to calculate new visual opacity and visual gradient values for voxels;
- Figs. 18- 20 are tables showing how the new visual opacity values for voxels are calculated.
- Figs. 21 - 23 are tables showing how the new visual opacity values are used to calculate new visual gradient values for voxels.
- Volumetric data volume is first partitioned into slices as illustrated in Figs. 3, 4 and 5, and the slices each contain rows and columns of voxels, not seen in these Figures but seen in Figures 15 - 17.
- Volume rendering of the data may be sped up by reducing the number of voxels that are processed, downloaded and rendered by eliminating transparent voxels. Such volume rendering is taught in the previously mentioned U.S. Patent No. 6,304,266 that is incorporated herein by reference.
- In Figs. 15 - 17 are shown a representative volume of voxels formed into a plurality of slices A - E and cells. As particularly applied to geoscience data, voxels are grouped into cells containing no fewer than twenty-seven alphanumerically designated voxels (e.g., A11, C21) and forming a cube. Other undesignated voxels shown in these Figures are other voxels of the data volume in which the cells being described are located. Only one cell is shown in each of these Figures for simplicity, but in reality there would be many slices and many cells in each slice of a volume. These cells are described in greater detail hereinafter.
- the slices are shown separated from each other only for ease in seeing the twenty-seven voxels A11-A33, B11-B33 and C11-C33 that make up cells one and three in Figs. 15 and 17 respectively, and the twenty-seven voxels B11-B33, C11-C33 and D11-D33 that make up cell two in Fig. 16.
- only horizontal slices are specifically shown in Figs. 15, 16 and 17, but there are also vertical slices, and the intersecting orthogonal slices through the data volume are used to create the three-dimensional rows and columns of voxels, as shown, so that, in this particular case, the vector / arrow direction of the light source is parallel to one of the three orthogonal axes (u, v, w) of the sliced data volume while the other two axes are orthogonal to the direction of the light source vector.
- in Figs. 15 and 16 the light vector is parallel to axis w.
- in Fig. 17 the light vector is parallel to axis v.
- the method employed supports arbitrary directions for the light source.
- the value of standard opacity, α, for the voxels in each cell is used to calculate the values of visible opacity α′ only for the voxels in each cell, as described hereinafter with reference to Figs. 18 - 20.
- a different cell partitioning size might be appropriate for other data, such as medical or meteorological data.
- An example of another cell size is one hundred twenty-five voxels forming a cube of 5x5x5 voxels. The number of voxels in a cell is based on the distribution of data in the volume.
- in cell one of Fig. 15 the particular voxel of interest is the center voxel B22. In Fig. 15 light is incident on the top of the data volume and cell one and is represented by the light source vector / arrow.
- in cell two of Fig. 16 the particular voxel of interest is the center voxel C22. In Fig. 16 light is incident on the top of the data volume and cell two and is represented by the light source vector / arrow.
- in Fig. 17 cell three is made up of the same voxels as cell one, but light is incident on the right side of the data volume and cell three, as represented by the light source vector / arrow.
- the initial opacity value for each voxel in a data volume is first mapped with an opacity tool, such as the one illustrated in Fig. 10 and included in the GEOVIZ product of Schlumberger-GeoQuest, to obtain a "standard opacity" value α for each voxel in the data volume.
- when voxel A11 is semi-transparent (initial opacity value between 0.0 and 1.0), the opacity values of voxels B11 and C11 behind voxel A11 cannot be any less.
- for example, if voxel A11 is semi-transparent with an opacity value of 0.7,
- then voxels B11 and C11 cannot have a visible opacity value any lower than 0.7, and their values are changed accordingly in calculating the visible opacity values α′ for voxels B11 and C11.
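The per-cell tables of Figs. 18 - 20 are not reproduced in this text, but the stated constraint (a voxel's visible opacity can be no lower than that of any voxel between it and the light) can be sketched as a running maximum along a column of voxels ordered from the light inward; this is a simplified reading, not the patent's exact table computation:

```python
def visible_opacity_column(alphas):
    """Given standard opacities of a column of voxels ordered front-to-back
    with respect to the light (e.g. A11, B11, C11), return visible
    opacities: each voxel's value is at least that of every voxel in
    front of it."""
    out, running = [], 0.0
    for a in alphas:
        running = max(running, a)   # opacity accumulated toward the light
        out.append(running)
    return out
```

For the example above, a front voxel at 0.7 raises the voxels behind it to at least 0.7.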
- Figs. 18 - 20 are tables used to simplify and more clearly show the mathematical calculations performed by the above equations to derive the new visible opacity values α′, respectively, for all twenty-seven voxels in each of cells one, two and three, using the "standard opacity" values α of the voxels in accordance with the teaching of the present invention.
- "α" indicates standard opacity and, when used as αA11, indicates the standard opacity of voxel A11.
- "α′" indicates the new visible opacity and, when used as α′A11, indicates the visible opacity of voxel A11.
- although only one cell is shown in Fig. 15, in reality there are many cells in a data volume and the visible opacity α′ is determined for all voxels in the data volume. This is done by having each voxel at the center of a cell and performing the calculations described above. The voxels on the outer surface of the volume can be disregarded.
- Fig. 19 is a table showing how the standard opacity α values of each of the voxels in cell two (Fig. 16) are used to obtain the new visible opacity α′ values for each of the voxels in cell two.
- although only one cell is shown in Fig. 16, in reality there are many cells in a data volume and the visible opacity α′ is determined for all voxels in the data volume. This is done by having each voxel at the center of a cell and performing the calculations described above. The voxels on the outer surface of the volume can be disregarded.
- Fig. 17 shows cell three, which has the same voxels as cell one shown in Fig. 15, but the direction of light on the data volume and cell three is from the right side rather than from the top. Accordingly, the calculations for visible opacity α′ are performed in an identical manner but yield different values.
- with the light from the right side, the columns of three voxels lie on their side through cell three: for example, voxels B23, B22, B21 and voxels C13, C12, C11.
- the tables for calculating visible opacity α′ for all voxels in cell three are shown in Fig. 20. In view of the previous description of how these calculations are made with reference to the tables in Figs. 18 and 19, the description is not repeated here for the sake of brevity.
- the visible opacity α′ of all the voxels in cell three is used to calculate the visible opacity gradient G only for voxel B22 in the center of cell three, as described hereinafter with reference to Fig. 23. Again, there are many cells in the data volume in Fig. 17 and visible opacity is determined for all voxels.
- Figs. 21 - 23 are tables used to describe the mathematical calculations performed to derive the three gradient components G_u, G_v and G_w that define a new visible opacity gradient G for only the voxel in the center of each of representative cells one, two and three in accordance with the teaching of the present invention.
- the gradients must be derived for all voxels in a data volume, so similar tables are derived for the other voxels in a data volume, not just those in cells one, two and three.
- the three gradient components are calculated for every voxel using the newly calculated values of visible opacity α′ for all voxels in each cell. For cell one in Fig. 15 the center voxel is B22; for cell two in Fig. 16 the center voxel is C22; and for cell three in Fig. 17 the center voxel is B22.
- the new visible opacity gradient G for all voxels are then used to render the voxel data volume in a manner well known in the art. These calculations are repeated for each voxel in a ⁇ volume and are described in greater detail hereinafter.
- the new visible opacity α′ values calculated for each of the twenty-seven voxels B11 through D33 in cell two are used in the equations in Fig. 22 to calculate the gradient components G_u, G_v and G_w of the visible opacity gradient vector G only for voxel C22 in the middle of cell two.
- the gradient components are then combined to get the negative visible opacity gradient for voxel C22.
- the visible opacity α′ values for the twenty-seven voxels A11 through C33 in cell three are used in the equations in Fig. 23 to calculate the vector components G_u, G_v and G_w of the visible opacity gradient vector G only for voxel B22.
- a negative opacity gradient, G, at a particular voxel is determined by three partial derivatives along the three major axes, G_u, G_v and G_w, as:

  G_u = Σ_{j,k} α(i+1, j, k) − Σ_{j,k} α(i−1, j, k)
  G_v = Σ_{i,k} α(i, j+1, k) − Σ_{i,k} α(i, j−1, k)
  G_w = Σ_{i,j} α(i, j, k+1) − Σ_{i,j} α(i, j, k−1)
- these three standard gradient equations for calculating the vector components G_u, G_v and G_w are modified to calculate a new negative "visible opacity gradient" vector G, shown in the following equations, by using the new visible opacity α′ values rather than the standard opacity values α shown in the equations immediately above. Substituting the visible opacity α′ in the standard gradient equations, they become:

  G_u = Σ_{j,k} α′(i+1, j, k) − Σ_{j,k} α′(i−1, j, k)
  G_v = Σ_{i,k} α′(i, j+1, k) − Σ_{i,k} α′(i, j−1, k)
  G_w = Σ_{i,j} α′(i, j, k+1) − Σ_{i,j} α′(i, j, k−1)

  where α′(i, j, k) is the visible opacity of a single voxel indexed along its three major axes i, j, k.
- G = (−G_u, −G_v, −G_w), where the vector components G_u, G_v and G_w are calculated using the tables in Figs. 21 - 23.
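One reading of the summed central differences above, evaluated at the center voxel of a 3x3x3 cell, can be sketched as follows; the callable-based interface and function name are illustrative:

```python
def visible_gradient(vis_alpha, i, j, k):
    """Negative visible-opacity gradient at a cell's center voxel,
    summing central differences over the 3x3x3 cell around (i, j, k).
    `vis_alpha` is any callable vis_alpha(i, j, k) -> visible opacity."""
    rng = (-1, 0, 1)
    gu = sum(vis_alpha(i + 1, j + a, k + b) - vis_alpha(i - 1, j + a, k + b)
             for a in rng for b in rng)
    gv = sum(vis_alpha(i + a, j + 1, k + b) - vis_alpha(i + a, j - 1, k + b)
             for a in rng for b in rng)
    gw = sum(vis_alpha(i + a, j + b, k + 1) - vis_alpha(i + a, j + b, k - 1)
             for a in rng for b in rng)
    return (-gu, -gv, -gw)   # negative gradient, as in the text
```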
- the new visible opacity gradient G for each of voxels B22 (cell one), C22 (cell two) and B22 (cell three) is then used to calculate the ambient and diffuse shading intensity for those voxels in a manner previously known, with the addition of special treatments provided by this invention, as generally described with reference to block 1404 in Fig. 14.
- Shading is computed as if it were applied to an opacity isosurface passing through all voxels in a volume in an ambient and diffuse illumination model wherein the voxel volume is illuminated with one or more light sources (typically directional and bidirectional).
- the direction of the negative visible opacity gradient vector serves in this case as a normal to the isosurface.
- a lighted and shaded volume offers the viewer ample visual information to aid in the perception of depth, as well as the shapes, orientations, and positions of objects in the volume.
- the ambient and diffuse shading intensity is calculated as follows, where L is the light source vector:

  if G · L > 0:
      I_shading = I_ambient + I_diffuse · (G · L) / (|G| · |L|)

  else, if G · L ≤ 0, the voxel is treated as positioned within a homogeneous voxel body and:

      I_shading = 1
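A sketch of the ambient-and-diffuse intensity rule described above; the coefficient values I_ambient = 0.3 and I_diffuse = 0.7 are illustrative assumptions, not values from the patent:

```python
def shading_intensity(G, L, i_ambient=0.3, i_diffuse=0.7):
    """Ambient + diffuse shading: if G.L > 0, intensity is
    Ia + Id * (G.L)/(|G||L|); otherwise the voxel is treated as inside
    a homogeneous voxel body and the intensity is 1.
    Coefficients are illustrative defaults."""
    dot = sum(g * l for g, l in zip(G, L))
    if dot <= 0.0:
        return 1.0
    mag = (sum(g * g for g in G) ** 0.5) * (sum(l * l for l in L) ** 0.5)
    return i_ambient + i_diffuse * dot / mag
```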
- After the shading intensity has been computed, including the special treatment in scenarios (a) - (d), it is applied to a color-coded opacity volume, as referred to with reference to block 1405 in Fig. 14. There are multiple known ways to implement this. Often, the color of a voxel is derived from the associated data value, v, using a color palette, also called a look-up table or LUT, as:
- (r, g, b, a) = LUT(v), where LUT is essentially a one-dimensional array of (r, g, b, a) quadruples indexed by the data value, v.
- the final color of the voxel depends not only on the data value, but also on the shading intensity associated with the voxel.
- the initial formula calls for multiplication of each of the color components of each voxel by the shading intensity for the voxel. This requires many additional computations, slowing the process.
- To avoid this, an extended color palette SHADING_LUT is used: a two-dimensional matrix composed of (r, g, b, a) columns computed for different values of shading intensity, I_shading.
- the extended color palette can then be used to look up the color of a shaded voxel using the data value, v, and the shading intensity, I_shading, as indexes:

  (r, g, b, a) = SHADING_LUT(v, I_shading)
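The two-dimensional SHADING_LUT idea can be sketched as follows; the palette contents, table sizes, and the name `shaded_color` are illustrative assumptions:

```python
# Sketch: precompute shaded colors for every (shading level, data value)
# pair so rendering needs one lookup instead of per-voxel multiplications.
N_VALUES, N_SHADES = 4, 3
BASE_LUT = [(v / (N_VALUES - 1.0), 0.0, 0.0, 1.0)   # toy red-ramp palette
            for v in range(N_VALUES)]

SHADING_LUT = [
    [(s * r, s * g, s * b, a) for (r, g, b, a) in BASE_LUT]
    for s in (i / (N_SHADES - 1.0) for i in range(N_SHADES))
]

def shaded_color(value_idx, shade_idx):
    """Single table lookup by data value and quantized shading intensity."""
    return SHADING_LUT[shade_idx][value_idx]
```

Note that opacity (the `a` component) is carried through unscaled; only the color components are dimmed by the shading level.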
- the initial coloring (r,g,b) of a 3D seismic opacity volume does not depend on the volume opacity and can be specified based on any data values associated with a volume (e.g., amplitude, instantaneous frequency, etc.).
- the color components of the voxel material are modified by multiplying them with the shading intensity:

  (r′, g′, b′) = (I_shading · r, I_shading · g, I_shading · b)
- the results of this calculation are the rendered volumetric data used to display the 3D seismic information volume on a 2D display device.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0314041A GB2386811B (en) | 2000-12-18 | 2001-12-14 | Method and apparatus for visualization of 3D voxel data using lit opacity volumes with shading |
AU2002236628A AU2002236628A1 (en) | 2000-12-18 | 2001-12-14 | Method and apparatus for visualization of 3d voxel data using lit opacity volumes with shading |
CA002432090A CA2432090C (en) | 2000-12-18 | 2001-12-14 | Method and apparatus for visualization of 3d voxel data using lit opacity volumes with shading |
NO20023903A NO324124B1 (en) | 2000-12-18 | 2002-08-16 | Method and apparatus for visualizing three-dimensional wax data using illuminated opacity volumes with shading |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US25643300P | 2000-12-18 | 2000-12-18 | |
US60/256,433 | 2000-12-18 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2002050779A1 true WO2002050779A1 (en) | 2002-06-27 |
Family
ID=22972213
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2001/048439 WO2002050779A1 (en) | 2000-12-18 | 2001-12-14 | Method and apparatus for visualization of 3d voxel data using lit opacity volumes with shading |
Country Status (7)
Country | Link |
---|---|
US (1) | US6940507B2 (en) |
AU (1) | AU2002236628A1 (en) |
CA (1) | CA2432090C (en) |
GB (1) | GB2386811B (en) |
NO (1) | NO324124B1 (en) |
RU (1) | RU2298831C9 (en) |
WO (1) | WO2002050779A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2004047028A1 (en) * | 2002-11-15 | 2004-06-03 | Siemens Aktiengesellschaft | Method for depicting an object displayed in a volume data set |
US7515743B2 (en) * | 2004-01-08 | 2009-04-07 | Siemens Medical Solutions Usa, Inc. | System and method for filtering a medical image |
RU2464640C2 (en) * | 2006-10-27 | 2012-10-20 | Эрбюс Операсьон (Сас) | Method and apparatus for providing simulation of three-dimensional objects |
Families Citing this family (51)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050114831A1 (en) * | 2001-04-18 | 2005-05-26 | Andres Callegari | Volume body renderer |
CA2936413C (en) * | 2001-04-18 | 2018-09-04 | Landmark Graphics Corporation, A Halliburton Company | Volume body renderer |
NZ514119A (en) * | 2001-09-11 | 2004-06-25 | Deep Video Imaging Ltd | Improvement to instrumentation |
US7619585B2 (en) * | 2001-11-09 | 2009-11-17 | Puredepth Limited | Depth fused display |
NZ517713A (en) | 2002-06-25 | 2005-03-24 | Puredepth Ltd | Enhanced viewing experience of a display through localised dynamic control of background lighting level |
NZ521505A (en) | 2002-09-20 | 2005-05-27 | Deep Video Imaging Ltd | Multi-view display |
US7095409B2 (en) * | 2003-04-30 | 2006-08-22 | Pixar | Shot shading method and apparatus |
NZ525956A (en) | 2003-05-16 | 2005-10-28 | Deep Video Imaging Ltd | Display control system for use with multi-layer displays |
US7298376B2 (en) * | 2003-07-28 | 2007-11-20 | Landmark Graphics Corporation | System and method for real-time co-rendering of multiple attributes |
US7834890B2 (en) * | 2003-10-17 | 2010-11-16 | Canon Kabushiki Kaisha | Information processing method and image processing method |
US20050171700A1 (en) * | 2004-01-30 | 2005-08-04 | Chroma Energy, Inc. | Device and system for calculating 3D seismic classification features and process for geoprospecting material seams |
NZ542843A (en) * | 2005-10-05 | 2008-08-29 | Pure Depth Ltd | Method of manipulating visibility of images on a volumetric display |
US7671867B2 (en) * | 2006-05-08 | 2010-03-02 | Schlumberger Technology Corporation | Method for locating underground deposits of hydrocarbon including a method for highlighting an object in a three dimensional scene |
CN103185896B (en) | 2006-09-01 | 2016-08-10 | 哈利伯顿兰德马克绘图公司 | For wave shape body being carried out the system and method for imaging |
US7627429B2 (en) * | 2006-09-15 | 2009-12-01 | Schlumberger Technology Corporation | Method for producing underground deposits of hydrocarbon from an earth formation using fault interpretation including spline fault tracking |
CA2674820C (en) * | 2007-01-05 | 2020-01-21 | Landmark Graphics Corporation, A Halliburton Company | Systems and methods for selectively imaging objects in a display of multiple three-dimensional data-objects |
EP2102823B8 (en) | 2007-01-05 | 2016-06-29 | Landmark Graphics Corporation | Systems and methods for visualizing multiple volumetric data sets in real time |
JP4588736B2 (en) * | 2007-04-12 | 2010-12-01 | 富士フイルム株式会社 | Image processing method, apparatus, and program |
US8432411B2 (en) * | 2007-05-18 | 2013-04-30 | Pure Depth Limited | Method and system for improving display quality of a multi-component display |
US9171391B2 (en) | 2007-07-27 | 2015-10-27 | Landmark Graphics Corporation | Systems and methods for imaging a volume-of-interest |
US7702463B2 (en) | 2007-12-12 | 2010-04-20 | Landmark Graphics Corporation, A Halliburton Company | Systems and methods for enhancing a seismic data image |
US8255816B2 (en) * | 2008-01-25 | 2012-08-28 | Schlumberger Technology Corporation | Modifying a magnified field model |
JP4376944B2 (en) * | 2008-04-03 | 2009-12-02 | 富士フイルム株式会社 | Intermediate image generation method, apparatus, and program |
CN102047294B (en) * | 2008-06-06 | 2013-10-30 | Landmark Graphics International, Inc., Halliburton | Systems and methods for imaging a three-dimensional volume of geometrically irregular grid data representing a grid volume |
US9524700B2 (en) * | 2009-05-14 | 2016-12-20 | Pure Depth Limited | Method and system for displaying images of various formats on a single display |
US8928682B2 (en) | 2009-07-07 | 2015-01-06 | Pure Depth Limited | Method and system of processing images for improved display |
AU2010347724B2 (en) | 2010-03-12 | 2016-06-23 | Exxonmobil Upstream Research Company | Dynamic grouping of domain objects via smart groups |
WO2011136861A1 (en) | 2010-04-30 | 2011-11-03 | Exxonmobil Upstream Research Company | Method and system for finite volume simulation of flow |
AU2011283190A1 (en) | 2010-07-29 | 2013-02-07 | Exxonmobil Upstream Research Company | Methods and systems for machine-learning based simulation of flow |
AU2011283196B2 (en) | 2010-07-29 | 2014-07-31 | Exxonmobil Upstream Research Company | Method and system for reservoir modeling |
AU2011283193B2 (en) | 2010-07-29 | 2014-07-17 | Exxonmobil Upstream Research Company | Methods and systems for machine-learning based simulation of flow |
US9036869B2 (en) | 2010-08-31 | 2015-05-19 | Zeta Instruments, Inc. | Multi-surface optical 3D microscope |
GB2502432B (en) | 2010-09-20 | 2018-08-01 | Exxonmobil Upstream Res Co | Flexible and adaptive formulations for complex reservoir simulations |
US9330490B2 (en) * | 2011-04-29 | 2016-05-03 | University Health Network | Methods and systems for visualization of 3D parametric data during 2D imaging |
WO2013039606A1 (en) | 2011-09-15 | 2013-03-21 | Exxonmobil Upstream Research Company | Optimized matrix and vector operations in instruction limited algorithms that perform eos calculations |
US9595129B2 (en) | 2012-05-08 | 2017-03-14 | Exxonmobil Upstream Research Company | Canvas control for 3D data volume processing |
CA2883169C (en) | 2012-09-28 | 2021-06-15 | Exxonmobil Upstream Research Company | Fault removal in geological models |
US10048396B2 (en) * | 2013-03-14 | 2018-08-14 | Exxonmobil Upstream Research Company | Method for region delineation and optimal rendering transform of seismic attributes |
US10088596B2 (en) * | 2013-03-15 | 2018-10-02 | Schlumberger Technology Corporation | Meshless representation of a geologic environment |
RU2533055C1 (en) * | 2013-09-27 | 2014-11-20 | Biomedical Technologies LLC | Method of optimising maximum intensity projection technique for rendering scalar three-dimensional data in static mode, in interactive mode and in real time |
WO2016018723A1 (en) | 2014-07-30 | 2016-02-04 | Exxonmobil Upstream Research Company | Method for volumetric grid generation in a domain with heterogeneous material properties |
AU2015339883B2 (en) | 2014-10-31 | 2018-03-29 | Exxonmobil Upstream Research Company | Methods to handle discontinuity in constructing design space for faulted subsurface model using moving least squares |
US10803534B2 (en) | 2014-10-31 | 2020-10-13 | Exxonmobil Upstream Research Company | Handling domain discontinuity with the help of grid optimization techniques |
WO2016094483A1 (en) * | 2014-12-09 | 2016-06-16 | Schlumberger Canada Limited | Visualization of vector fields using lights |
HUE064459T2 (en) | 2016-12-23 | 2024-03-28 | Exxonmobil Technology & Engineering Company | Method and system for stable and efficient reservoir simulation using stability proxies |
US10782441B2 (en) * | 2017-04-25 | 2020-09-22 | Analogic Corporation | Multiple three-dimensional (3-D) inspection renderings |
WO2019221711A1 (en) * | 2018-05-15 | 2019-11-21 | Hewlett-Packard Development Company, L.P. | Print property control |
WO2020040729A1 (en) * | 2018-08-20 | 2020-02-27 | Hewlett-Packard Development Company, L.P. | Generating a preview of a part to be printed |
US11300680B2 (en) * | 2019-04-02 | 2022-04-12 | Raytheon Company | Three-dimensional (3D) radar weather data rendering techniques |
WO2020237089A1 (en) | 2019-05-21 | 2020-11-26 | Magic Leap, Inc. | Caching and updating of dense 3d reconstruction data |
WO2021108718A1 (en) * | 2019-11-27 | 2021-06-03 | Brainspec Inc. | Systems and methods for analyzing and interfacing with combined imaging and spectroscopy data for the brain |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5381518A (en) * | 1986-04-14 | 1995-01-10 | Pixar | Method and apparatus for imaging volume data using voxel values |
US5442733A (en) * | 1992-03-20 | 1995-08-15 | The Research Foundation Of State University Of New York | Method and apparatus for generating realistic images using a discrete representation |
US5807488A (en) * | 1997-02-19 | 1998-09-15 | Metafix Inc. | Exchangeable filter medium container and method of connecting and recycling such containers |
US5831623A (en) * | 1995-08-09 | 1998-11-03 | Mitsubishi Denki Kabushiki Kaisha | Volume rendering apparatus and method |
US5847711A (en) * | 1994-09-06 | 1998-12-08 | The Research Foundation Of State University Of New York | Apparatus and method for parallel and perspective real-time volume visualization |
US5963211A (en) * | 1995-06-29 | 1999-10-05 | Hitachi, Ltd. | Method and apparatus for directly generating three-dimensional images from voxel data with dividing image generating processes and utilizing parallel processes |
US6008813A (en) * | 1997-08-01 | 1999-12-28 | Mitsubishi Electric Information Technology Center America, Inc. (Ita) | Real-time PC based volume rendering system |
US6040835A (en) * | 1997-11-06 | 2000-03-21 | Mitsubishi Electric Information Technology Center America, Inc. (Ita) | System for depicting surfaces using volumetric distance maps |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4835712A (en) | 1986-04-14 | 1989-05-30 | Pixar | Methods and apparatus for imaging volume data with shading |
JP2627607B2 (en) * | 1993-06-16 | 1997-07-09 | IBM Japan, Ltd. | Volume rendering method |
JP3537594B2 (en) | 1996-06-13 | 2004-06-14 | Aloka Co., Ltd. | Ultrasonic diagnostic equipment |
US5986662A (en) | 1996-10-16 | 1999-11-16 | Vital Images, Inc. | Advanced diagnostic viewer employing automated protocol selection for volume-rendered imaging |
US5986612A (en) * | 1996-12-30 | 1999-11-16 | General Motors Corporation | Vehicle window antenna |
US6278459B1 (en) * | 1997-08-20 | 2001-08-21 | Hewlett-Packard Company | Opacity-weighted color interpolation for volume sampling |
US6130671A (en) | 1997-11-26 | 2000-10-10 | Vital Images, Inc. | Volume rendering lighting using dot product methodology |
- 2001
- 2001-12-14 WO PCT/US2001/048439 patent/WO2002050779A1/en not_active Application Discontinuation
- 2001-12-14 CA CA002432090A patent/CA2432090C/en not_active Expired - Fee Related
- 2001-12-14 RU RU2003117723/09A patent/RU2298831C9/en not_active IP Right Cessation
- 2001-12-14 US US10/017,560 patent/US6940507B2/en not_active Expired - Fee Related
- 2001-12-14 GB GB0314041A patent/GB2386811B/en not_active Expired - Fee Related
- 2001-12-14 AU AU2002236628A patent/AU2002236628A1/en not_active Abandoned
- 2002
- 2002-08-16 NO NO20023903A patent/NO324124B1/en not_active IP Right Cessation
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2004047028A1 (en) * | 2002-11-15 | 2004-06-03 | Siemens Aktiengesellschaft | Method for depicting an object displayed in a volume data set |
US7457816B2 (en) | 2002-11-15 | 2008-11-25 | Siemens Aktiengesellschaft | Method for depicting an object displayed in a volume data set |
US7515743B2 (en) * | 2004-01-08 | 2009-04-07 | Siemens Medical Solutions Usa, Inc. | System and method for filtering a medical image |
RU2464640C2 (en) * | 2006-10-27 | 2012-10-20 | Эрбюс Операсьон (Сас) | Method and apparatus for providing simulation of three-dimensional objects |
Also Published As
Publication number | Publication date |
---|---|
RU2298831C2 (en) | 2007-05-10 |
GB0314041D0 (en) | 2003-07-23 |
NO324124B1 (en) | 2007-08-27 |
NO20023903D0 (en) | 2002-08-16 |
NO20023903L (en) | 2002-10-18 |
GB2386811B (en) | 2005-06-08 |
AU2002236628A1 (en) | 2002-07-01 |
US20020109684A1 (en) | 2002-08-15 |
CA2432090C (en) | 2009-02-17 |
US6940507B2 (en) | 2005-09-06 |
RU2298831C9 (en) | 2007-08-10 |
CA2432090A1 (en) | 2002-06-27 |
GB2386811A (en) | 2003-09-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA2432090C (en) | Method and apparatus for visualization of 3d voxel data using lit opacity volumes with shading | |
US5307450A (en) | Z-subdivision for improved texture mapping | |
US6304266B1 (en) | Method and apparatus for volume rendering | |
Wilhelms et al. | A coherent projection approach for direct volume rendering | |
CA2933764C (en) | System and method for real-time co-rendering of multiple attributes | |
US6191794B1 (en) | Method and apparatus for scaling texture maps for graphical images | |
US7245300B2 (en) | Architecture for real-time texture look-up's for volume rendering | |
EP0180296A2 (en) | System and method for displaying and interactively excavating and examining a three dimensional volume | |
CN105096385B (en) | A three-dimensional display method for two-dimensional seismic sections | |
US20060203010A1 (en) | Real-time rendering of embedded transparent geometry in volumes on commodity graphics processing units | |
CN102903141B (en) | Opacity-weighted texture-mapped volume rendering method fusing multiple seismic attributes | |
Goodsell et al. | Rendering volumetric data in molecular systems | |
Chen et al. | Manipulation, display, and analysis of three-dimensional biological images | |
US8797383B2 (en) | Method for stereoscopic illustration | |
WO2002007088A2 (en) | Apparatus and method for diffuse and specular lighting of voxel data | |
Plate et al. | A flexible multi-volume shader framework for arbitrarily intersecting multi-resolution datasets | |
JP2003263651A (en) | Volume rendering method and its program | |
US5793372A (en) | Methods and apparatus for rapidly rendering photo-realistic surfaces on 3-dimensional wire frames automatically using user defined points | |
Chen et al. | Representation, display, and manipulation of 3D digital scenes and their medical applications | |
US20070188492A1 (en) | Architecture for real-time texture look-up's for volume rendering | |
Wan et al. | Interactive stereoscopic rendering of volumetric environments | |
Razdan et al. | Volume visualization of multicolor laser confocal microscope data | |
Leith | Computer visualization of volume data in electron tomography | |
Dolgovesov et al. | Real-Time Volume Rendering Systems. | |
Chen et al. | Software And Hardware For 3-D Gray Level Image Analysis And Quantization |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW |
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
ENP | Entry into the national phase |
Ref document number: 0314041 Country of ref document: GB Kind code of ref document: A Free format text: PCT FILING DATE = 20011214 Format of ref document f/p: F |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2432090 Country of ref document: CA |
ENP | Entry into the national phase |
Country of ref document: RU Kind code of ref document: A Format of ref document f/p: F |
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
122 | Ep: pct application non-entry in european phase | ||
NENP | Non-entry into the national phase |
Ref country code: JP |
WWW | Wipo information: withdrawn in national office |
Country of ref document: JP |