US20050088456A1 - System and method for run-time integration of an inset geometry into a background geometry - Google Patents


Info

Publication number
US20050088456A1
Authority
US
United States
Prior art keywords
geometry
inset
skirt
background
perimeter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/979,892
Inventor
Michael Cosman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Evans and Sutherland Computer Corp
Original Assignee
Evans and Sutherland Computer Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Evans and Sutherland Computer Corp filed Critical Evans and Sutherland Computer Corp
Priority to US10/979,892 priority Critical patent/US20050088456A1/en
Assigned to EVANS & SUTHERLAND COMPUTER CORPORATION reassignment EVANS & SUTHERLAND COMPUTER CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: COSMAN, MICHAEL A.
Publication of US20050088456A1 publication Critical patent/US20050088456A1/en
Legal status: Abandoned

Classifications

    • G - PHYSICS
        • G06 - COMPUTING; CALCULATING OR COUNTING
            • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
                    • G06T17/20 - Finite element generation, e.g. wire-frame surface description, tessellation
                • G06T19/00 - Manipulating 3D models or images for computer graphics
                    • G06T19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
                • G06T2219/00 - Indexing scheme for manipulating 3D models or images for computer graphics
                    • G06T2219/20 - Indexing scheme for editing of 3D models
                        • G06T2219/2021 - Shape modification

Definitions

  • The present invention discloses a video graphics display system that can merge a smaller inset geometry into a larger background geometry at run time.
  • The inset geometry replaces a section of the background geometry, and a gradual transition from the geometric behavior of the larger section to that of the smaller section is achieved in an advantageous way.
  • The transition allows the composite geometry created from the two separate geometries to be visually continuous and avoids cracks between the two geometries during display.
  • The relative positions of the inset geometry and the larger background geometry need not be known until run time.
  • The skirt comprises a set of polygons that extend outward from the perimeter of the inset geometry by a given distance, and this distance need not be constant in all directions. Skirt polygons have vertices that are completely and exactly coincident with the inset geometry along the perimeter of the inset.
  • An example of an insert being made into background geometry is shown in FIGS. 2-5.
  • The term background geometry is used here to mean a larger or global geometry within which a smaller geometry can be inset or inserted. In terms of the present invention, this means a smaller piece of inset geometry can be inserted into a larger piece of background geometry in real time without requiring a recompilation of the entire geometry model database.
  • For example, an airport, army base, city, or any group of buildings can be inserted into a larger terrain.
  • Similarly, a crater or destroyed building portion can be inserted into the geometry of a building, dam, or group of buildings.
  • The background geometry can also be described as the primary geometry or global geometry.
  • FIGS. 2-5 illustrate the process of incorporating a new inset geometry within a background geometry.
  • A terrain 100 may be the background geometry.
  • Alternatively, the background geometry might be a building, dam, or mountain that incorporates a geometric displacement (e.g., from an explosion) and requires new inset geometry.
  • Another situation that might require dynamic inset geometry is a changing geometry, such as a bulldozer making cuts to the landscape in real time.
  • In the airport example, the inset geometry can be the portion of the geometry inside the airport fence. For more complex images or models that involve insets on cuts and fills, some additional surrounding terrain may be included.
  • The model with a portion removed 108 is shown in FIG. 3; it requires additional geometry and control information.
  • One or more polygons are used to define the footprint of the inset geometry.
  • The footprint is used to define an opening cutter 104 (FIG. 2) used to cut a hole or opening 106 in the background geometry 102 or global geometry.
  • The opening cutter can define an area of a terrain hole to accommodate an inset geometry 110, as in FIG. 4.
  • The cutting can be implemented by a hierarchical suppression of geometry facets based on whether they are completely surrounded by the hole cutter 104, or in other ways known to those skilled in the art.
  • Polygon facets that are not completely surrounded by the cutter are divided into polygon fragments by the cutter boundary, so that every background polygon is either inside or outside the cutter boundary.
  • A skirt of polygons 114 is created around the perimeter of the inset geometry so that the outside vertices conform to the background geometry and the inside vertices are coincident with the inset geometry 110, as shown in FIG. 5.
  • The outside skirt vertices are shared with the vertices of the hole cutter 104.
  • The skirt 114 is conformed to the background around its outside perimeter by clipping it to the underlying background polygons. This creates additional vertices along the inside edge of the skirt, where the skirt joins the inset geometry; these vertices lie geometrically along the original, unclipped inside edges of the skirt. Other vertices created during the clipping operation, lying between the inside and outside edges of the skirt, receive a blend of behavior between the inside and outside skirt vertices.
  • FIG. 6 illustrates a flow chart depicting operations performed in an embodiment of the invention.
  • First, a skirt is constructed around the perimeter of the inset geometry in block 200.
  • The skirt polygons extend outward from the inset geometry a pre-selected distance that is not necessarily constant in all directions. The skirt shares vertices completely and exactly with the inset geometry along the perimeter of the inset. The combination of the inset geometry with the skirt creates an extended inset.
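This vertex-sharing construction can be sketched in Python. The inner edge of each skirt quad reuses the inset's own perimeter vertex indices, so the shared boundary is exactly coincident by construction; `build_skirt` and its data layout are illustrative names, not taken from the patent.

```python
def build_skirt(vertices, perimeter, offsets):
    """Build one skirt quad per perimeter edge of a closed inset boundary.

    vertices  : shared list of (x, y) points; new outer points are appended
    perimeter : vertex indices of the inset boundary, in order
    offsets   : per-vertex outward displacement (skirt width need not be uniform)
    """
    outer = []
    for idx, (ox, oy) in zip(perimeter, offsets):
        x, y = vertices[idx]
        vertices.append((x + ox, y + oy))   # new outer skirt vertex
        outer.append(len(vertices) - 1)
    quads = []
    n = len(perimeter)
    for k in range(n):
        a, b = perimeter[k], perimeter[(k + 1) % n]   # shared inset edge (exact indices)
        c, d = outer[(k + 1) % n], outer[k]           # new outer edge
        quads.append((a, b, c, d))
    return quads
```

Because the inner edge stores the same indices as the inset perimeter (rather than copied coordinates), any later motion of the inset moves the skirt's inner edge with it, which is what prevents cracks along the shared boundary.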
  • Next, the portion of the background geometry overlaid by the extended inset is removed. This is done by constructing a cutter polygon or polygons from the outside perimeter vertices of the skirt in block 202.
  • The cutter polygon 212 is illustrated in FIG. 7 as it overlays the background geometry 214.
  • The cutter polygon is clipped into a contiguous mesh of cutter fragments 216, where each fragment is wholly and exactly contained within the background polygon it overlays and is geometrically coplanar with that background polygon.
  • The initial cutter polygon is not necessarily planar, and it will be divided into polygons or triangles to match the actual background geometry.
  • Alternatively, the cutter polygon may be applied by rendering it into the pixel frame buffer as a stencil mask to remove background polygons and portions of background polygons covered by the cutter.
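A CPU stand-in for this stencil-mask idea can be sketched with an even-odd point-in-polygon test; a real system would instead render the cutter into the hardware stencil buffer. The sampling scheme (one test per pixel center) is an assumption of this sketch.

```python
def point_in_polygon(px, py, poly):
    # Even-odd ray-casting test against a simple polygon given as (x, y) pairs.
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > py) != (y2 > py):
            if px < x1 + (py - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

def stencil_mask(cutter, width, height):
    # True marks samples covered by the cutter, where background pixels
    # would be suppressed; samples are taken at pixel centers.
    return [[point_in_polygon(x + 0.5, y + 0.5, cutter) for x in range(width)]
            for y in range(height)]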
  • There is a “projection direction” of the cutter polygon 212 onto the background polygons 214.
  • In one embodiment, the cutter polygons are projected vertically onto the background geometry and clipped against vertical clip planes defined by the edges of the underlying background polygons.
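The vertical projection of a point onto a background facet can be sketched as follows: solving barycentric coordinates in x/y gives both an inside test against the triangle and the interpolated height on its plane. This is a sketch under those assumptions, not the patent's implementation.

```python
def project_vertically(px, py, tri):
    """Project (px, py) straight down onto the plane of a 3-D triangle.

    Returns (inside, z): `inside` is True when the point lies over the
    triangle, and `z` is the interpolated height on the triangle's plane.
    """
    (x0, y0, z0), (x1, y1, z1), (x2, y2, z2) = tri
    d = (y1 - y2) * (x0 - x2) + (x2 - x1) * (y0 - y2)
    a = ((y1 - y2) * (px - x2) + (x2 - x1) * (py - y2)) / d
    b = ((y2 - y0) * (px - x2) + (x0 - x2) * (py - y2)) / d
    c = 1.0 - a - b
    inside = min(a, b, c) >= 0.0
    return inside, a * z0 + b * z1 + c * z2
```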
  • FIG. 8 illustrates that the application of the cutter polygon 212 generates a hole or opening in the background geometry that is precisely the shape of the inset geometry 222 and the skirt 220 to be inserted.
  • The area removed by the cutter polygon is greater than the inset geometry, to allow for the skirt. This means the inset geometry does not need to know the details of the background geometry, and the skirt can act as a geometric blend region.
  • The inset geometry does not need to be clipped, because the skirt will be matched to the inset geometry along their shared perimeter. Avoiding clipping of the inset geometry reduces the complexity of inserting any given inset geometry into the background geometry. In this process, neither the background nor the inset geometry is modified, which is a valuable advantage of the present invention.
  • The hole-cutting technique is designed to utilize the largest possible cutter polygons in order to maximize the ability to detect and discard background geometry facets that are wholly within the region of interest. If multiple cutter polygons are utilized, it is preferable (although not required) that they form a contiguous tiling of the area to be removed or suppressed, with complete vertex sharing.
  • Otherwise, the system may not be able to discern geometry facets that could be completely removed because they are covered only by the collective effect of several cutter polygons. As such, the process is most effective when the system utilizes a single convex cutter polygon, even if it requires a large number of vertices. It is more efficient to discard a completely covered background polygon than to render it (along with an associated cutter polygon) and then erase it pixel by pixel with a stencil operation.
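A sketch of the whole-facet discard test for a single convex cutter: since any simple polygon lies within the convex hull of its vertices, a background facet is wholly covered whenever every one of its vertices is inside the convex cutter. Counter-clockwise vertex ordering for the cutter is an assumption of this sketch.

```python
def fully_inside_convex(facet, cutter):
    """True if every vertex of `facet` lies on or inside the convex `cutter`
    (counter-clockwise order). Such a background facet can be discarded
    outright instead of being rendered and stencilled away per pixel."""
    n = len(cutter)
    for px, py in facet:
        for i in range(n):
            x1, y1 = cutter[i]
            x2, y2 = cutter[(i + 1) % n]
            # For a CCW convex boundary, the interior is to the left of each
            # edge, so this cross product must be non-negative.
            if (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1) < 0:
                return False
    return True
```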
  • FIG. 9 illustrates that the skirt polygons are clipped against the background polygons they overlay, in block 204 (FIG. 6).
  • This generates a set of skirt fragments 230 , each of which is entirely contained within one background polygon.
  • The skirt fragments are equal in size to, or smaller than, the background polygons that contain them, and the points at which the background polygons intersect the skirt 232 help define where the fragments are divided.
  • The set of clipped skirt fragments will be greater in number than the original set of skirt polygons.
  • Next, updated vertex coordinates 232 in the projection direction are computed for the clipped skirt fragment polygons, as in 206 (FIG. 6).
  • The computed component values can be a blend of at least one background attribute and at least one inset attribute at the skirt vertex, where these attributes might include geometry, shading, texture, and color data.
  • The blend factor can be based on the relative lateral location of each fragment vertex in the skirt:
  • Vertices on the outside edge of the skirt receive 100% background behavior;
  • vertices on the inside edge of the skirt receive 100% inset behavior; and
  • vertices in between get an appropriate blend of the two.
  • The clipped and conformed skirt thus becomes an integration or transition region between background behavior and inset behavior.
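The per-vertex blend described above can be sketched directly. Here `lateral_t` is a hypothetical parameterization of the vertex's relative lateral position across the skirt (0.0 on the outer edge, 1.0 on the inner edge), and attributes are packed as equal-length tuples; neither name comes from the patent.

```python
def blend_skirt_vertex(lateral_t, background_attr, inset_attr):
    """Linearly blend per-vertex attributes (height, color, texture
    coordinates, ...) across the skirt: pure background behavior at the
    outer edge (t = 0), pure inset behavior at the inner edge (t = 1)."""
    return tuple((1.0 - lateral_t) * b + lateral_t * i
                 for b, i in zip(background_attr, inset_attr))
```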
  • FIG. 10 illustrates that a first drop wall 252 of polygons beneath the visible surface of the model is constructed (208 in FIG. 6) after the new vertex coordinates are computed.
  • The top vertices 260 of the drop wall polygons are shared with the original perimeter vertices of the inset geometry.
  • The bottom vertices of these polygons are shifted outward slightly, and their visible sides face outward from the inset geometry.
  • The wall polygons lie below the nominal “surface,” so the drop wall is not visible to the end user.
  • The next step, in 210, is constructing a second drop wall of polygons 256 whose top vertices are shared with the clipped skirt fragment vertices 258 along the perimeter. The bottom vertices of these polygons are shifted slightly inward, and their visible sides face inward. At this point, the system renders all the resultant polygons.
  • The clipped skirt polygons share vertices exactly with the cutter polygons along the outside perimeter, preventing visual cracks along the boundary between the background geometry and the skirt.
  • The first set of drop wall polygons shares vertices exactly with the outside edge of the original inset, and the second set of wall polygons shares vertices exactly with the inside edge of the skirt. Since the walls intersect geometrically (due to the shift of their bottom vertices), any potential visual cracks along the boundary between the original inset and the inside edge of the skirt are geometrically closed off, even if the coincident points 258, 260 are momentarily not coincident. The result is the visually seamless insertion of one area of geometry inside another, without visual anomalies or cracks.
  • The skirt and drop wall polygons derived from the inset will also have a variable level-of-detail.
  • The wall polygons can be constructed so that they are just large enough to cover any potential crack. In general, this means the height of each wall polygon is computed to subtend about one pixel. This process takes into account the distance of each original inset perimeter vertex from the eye and computes a vertical drop for the corresponding wall bottom vertex that will (after perspective transformation) place it about one pixel below, so that the wall polygon at that position is about one pixel high. This minimizes the pixel fill-rate cost of the unseen wall polygons.
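A sketch of the one-pixel drop computation, assuming a symmetric perspective projection with a known vertical field of view (the projection model is an assumption, not stated in the patent): the world-space extent of one pixel at eye distance d is 2·d·tan(vfov/2)/H.

```python
import math

def one_pixel_drop(distance, vfov_deg, screen_height_px):
    """World-space vertical drop that subtends about one pixel at the given
    eye distance, so a wall polygon of this height covers roughly one pixel
    row after perspective projection (minimizing hidden-wall fill cost)."""
    return 2.0 * distance * math.tan(math.radians(vfov_deg) / 2.0) / screen_height_px
```

Each wall bottom vertex would be placed this far below its top vertex, using that vertex's own distance from the eye.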
  • The cutter polygon or polygons form a hole in the background geometry that exactly accommodates the inset geometry. If the underlying areas are not properly concealed, defects appear during the actual display of the image. As mentioned, these defects can cause visual problems for users who view the final output. They include shifts in the polygon boundaries that expose light or the color of any backdrop layer, i.e., a “crack” that erroneously lets whatever is behind show through.
  • The defects can also produce discontinuities that affect the virtual instrumentation utilized within the simulation environment.
  • For example, an infrared sensor may register the cracks as a temperature difference from the desired geometry, which can significantly distort the function and information provided to the user during the simulation. This can result in serious errors in performance and training. Further, bright spots may occur that are aesthetically unappealing during the training exercise and become distracting to the students utilizing the simulation environment. Since images are often generated at a rate of 60+ per second, flickering may occur as the geometry shifts during the simulation exercise, distracting the student and providing false information to the virtual instrumentation utilized within a simulation system.
  • The approach of the present invention utilizes vertical walls of polygons constructed along the inner edge of the skirt, which can extend several meters below ground within the global terrain. Since the walls are vertical, they are easily constructed in an automatic fashion from the vertices defining the inner edge of the clipped skirt. To complement this wall, a modeled-in-place, sloping, below-ground wall is generated as part of the inset geometry. This structure is not clipped to the terrain, but provides a region of double coverage that is large enough to close off any cracks completely. Further, these “below ground” structures will rarely be seen during the display operation, so they are simplified by omitting texture or shading finesse.
  • As an example, an airport terrain insert may be desired at a specific point in a global geometry.
  • The present invention allows the airport to be inserted without modifying or recompiling the global geometry.
  • The local detail must be matched up along the cut or skirt boundary and divided into versions that can be manipulated to match along the boundary. This matching becomes more difficult if the airport position must be adjusted at run time in order to place each of the several runway thresholds correctly.
  • The present invention can overcome these problems by using a skirt that allows the airport insert geometry to be adjusted slightly so the boundaries match up without creating visual anomalies.

Abstract

The invention provides a method for integrating an inset geometry within a background geometry. The method comprises the step of identifying a perimeter of the inset geometry. A further step is extending a skirt, having an outer perimeter and an inner perimeter, from the perimeter of the inset geometry out over the background geometry. An additional step is removing portions of the background geometry that are covered by the inset geometry and skirt. Another step is modifying the skirt so that the outer perimeter of the skirt matches background geometry behavior and the inner perimeter matches inset geometry behavior and a continuous transition exists between the outer perimeter and the inner perimeter.

Description

    FIELD OF THE INVENTION
  • The present invention relates generally to computer graphics. More particularly, the present invention relates to run-time integration of an inset geometry into a background geometry.
  • BACKGROUND
  • Computers have been used for many years to do image and graphics generation. In recent years computer generated graphics have become more sophisticated and the power of computer equipment has increased. Similarly, users' expectations of computer graphics have also increased. Computer users have come to expect more realism in computer graphics which generally means that there are more objects, and more light and texture processing on those objects.
  • Complex images and scenes are mathematically modeled in a three-dimensional space in the computer memory and manipulated accordingly. These three-dimensional mathematical models are called wire frames because all the edges of the object are visible at the same time when displayed. Three-dimensional models are made to look more realistic by removing the edges which should be hidden and by applying color and shading to the visible surfaces of the model. Texture also improves a simple polygon model by adding opacity and color variations.
  • In order to provide a better understanding of computer graphics architecture, a generalized computer graphics system will now be discussed. In FIG. 1, a host processor 20 is provided to process a display model or database. The host processor is connected to a geometry subsystem 22, which transforms object coordinate polygon data to the world coordinate system. The geometry system can also take care of lighting, viewing transformation and mapping to screen coordinates. The rasterization subsystem 24 converts transformed primitives to pixels and subpixels. Rasterization includes scan conversion, visible-surface determination and shading. Each pixel and/or subpixel is typically assigned an X and Y coordinate, a RGBA (i.e., Red, Green, Blue, Alpha) color value and a Z-value. The pixels are stored in a frame buffer 26 and then output to a display 28.
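As a minimal illustration of the geometry-to-raster handoff described above, the sketch below maps a world-space point to a screen pixel carrying the X, Y, RGBA, and Z values mentioned. The projection model (camera at the origin, +z forward, symmetric vertical field of view) and all names are assumptions for illustration, not part of the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class Pixel:
    x: int
    y: int
    rgba: tuple   # (R, G, B, A)
    z: float      # depth, kept for visible-surface determination

def project_point(pt, vfov_deg, width, height, rgba=(255, 255, 255, 255)):
    # Geometry-subsystem step: perspective-map a world-space point to screen
    # coordinates, retaining depth for the rasterizer's Z test.
    px, py, pz = pt
    f = (height / 2.0) / math.tan(math.radians(vfov_deg) / 2.0)
    sx = int(round(width / 2.0 + f * px / pz))
    sy = int(round(height / 2.0 - f * py / pz))
    return Pixel(sx, sy, rgba, pz)
```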
  • One element of a computer graphics system that is particularly relevant to the present discussion is the geometry subsystem. This is where the world model is processed and the transformation of the model will take place. Typically, the world model that is supplied to the geometry subsystem is fixed at run-time and the entire database that represents the scene geometry is compiled in advance. Up to this point in time, models or databases have only practically been modifiable at compile time and any insertion to the system model has been a compile time operation that involves reconstructing the model. This process is time consuming and can take anywhere from an hour up to several hours.
  • An example of a computer graphics application that has used compiled modeling techniques is high performance vehicle simulation. Such a simulation system may often include a cab that is a vehicle mock-up containing a crew compartment with vehicle instruments and controls. The cab can be mounted on a motion base to provide motion and acceleration cues by moving the cab. The motion base is coupled to a visual system, which provides out-the-window imagery and environmental data for the crew, host, or both.
  • A software system called the host oversees the operation of the simulator. The host monitors the control inputs provided by the crew, and causes the cockpit dials, instruments and displays to reflect the ongoing simulation status. In addition, the host controls the motion base and related audio systems, and tells the visual system what it needs to know to draw the corresponding out-the-window scene. A real-time system is a software program within the visual system that controls the image generator in response to host inputs.
  • The host tells the real-time system about object positions in the simulated environment (e.g., own aircraft, traffic aircraft, ground traffic, storms, etc.), the status of switchable or selectable items (e.g., runway and environmental lights, runway contamination, etc.), and position of global environmental effects like illumination (e.g., day, dusk, night) and visibility (e.g., fog, rain, snow, etc.). The real-time system returns data such as the nature of the surface beneath the tires of the aircraft, and whether collisions have occurred between the aircraft and other traffic or storm cells. This communication is largely asynchronous which means it occurs randomly as needed and is not locked to the ongoing computation of regular image frames. A simulation system can also contain many different types of scene elements such as terrain, aerials or tree canopies, linear features (roads, hedgerows, fences), and point features (trees, power poles, houses, light points). Other models can be included in the system such as moving models of airplanes, cars, and helicopters, or environmental models such as clouds, sky, storms, or lightning flashes, etc.
  • The real-time system gets the required scene data from disk storage and loads it into the appropriate parts of the image generator in an on-going background process called paging. It also sends commands to the image generator to implement lighting, environmental, and other special effects called for by the host. The real-time system determines the proper level-of-detail (LOD) for scene elements and prepares them for rendering after eliminating elements that will not appear in the scene. This process includes the translations and rotations needed to get scene elements into their proper position within the scene. In other words, the real-time system controls the geometry engine and provides the input needed to allow the scene to be viewed and transformed. Further, the real-time system also manages the rendering portion of the image generator in a synchronous, lock-step fashion that guarantees a steady stream of video to the displays.
  • SUMMARY OF THE INVENTION
  • The invention provides a method for integrating an inset geometry within a background geometry. The method comprises the step of identifying a perimeter of the inset geometry. A further step is extending a skirt, having an outer perimeter and an inner perimeter, from the perimeter of the inset geometry out over the background geometry. An additional step is removing portions of the background geometry that are covered by the inset geometry and skirt. Another step is modifying the skirt so that the outer perimeter of the skirt matches background geometry behavior and the inner perimeter matches inset geometry behavior and a continuous transition exists between the outer perimeter and the inner perimeter.
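The four steps above can be illustrated end to end on a toy one-dimensional cross-section, with height arrays standing in for 3-D geometry. `integrate_inset_1d` and its linear blend are illustrative simplifications of the method, not the patent's implementation.

```python
def integrate_inset_1d(bg, inset, i0, skirt_n):
    """Toy 1-D cross-section of the four-step method.

    bg, inset : height samples; the inset replaces bg[i0 : i0 + len(inset)]
                (steps 1-3: perimeter, skirt extent, removal of covered
                background), and skirt_n samples on each side form the skirt
                (step 4), blending from pure background height at the outer
                edge toward the inset edge height at the inner edge.
    """
    out = list(bg)
    i1 = i0 + len(inset)
    out[i0:i1] = inset
    for k in range(1, skirt_n + 1):
        w = (skirt_n + 1 - k) / (skirt_n + 1)   # inset weight: high near the inset
        out[i0 - k] = w * inset[0] + (1 - w) * bg[i0 - k]
        out[i1 - 1 + k] = w * inset[-1] + (1 - w) * bg[i1 - 1 + k]
    return out
```

The outer skirt samples approach the untouched background values and the inner edge exactly matches the inset, so the composite profile is continuous, which is the 1-D analogue of the crack-free transition the method provides.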
  • Additional features and advantages of the invention will be apparent from the detailed description which follows, taken in conjunction with the accompanying drawings, which together illustrate, by way of example, features of the invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of a computer graphics system;
  • FIG. 2 illustrates a cross-sectional side-view of geometry to be selected within the background geometry in an embodiment of the present invention;
  • FIG. 3 illustrates a side view of the formation of an opening or hole where the inset geometry is to be inserted in the background geometry;
  • FIG. 4 illustrates the insertion of the inset geometry within the background geometry as embodied according to the present invention;
  • FIG. 5 depicts the skirt mapping and coordination between the background geometry and the inset geometry as embodied in the present invention;
  • FIG. 6 depicts a flow diagram of a method for incorporating an inset geometry into a background geometry according to an embodiment of the present invention;
  • FIG. 7 illustrates a cutter polygon overlaid on a background geometry;
  • FIG. 8 illustrates a skirt surrounding an inset geometry as overlaid on the previous background geometry; and
  • FIG. 9 illustrates a skirt surrounding an inset geometry that is divided into clipped skirt fragments; and
  • FIG. 10 depicts a cross-section view of the polygon skirt and inset terrain and their accompanying walls that are beneath the visible surface of the model.
  • DETAILED DESCRIPTION
  • Reference will now be made to the exemplary embodiments illustrated in the drawings, and specific language will be used herein to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended. Alterations and further modifications of the inventive features illustrated herein, and additional applications of the principles of the invention as illustrated herein, which would occur to one skilled in the relevant art and having possession of this disclosure, are to be considered within the scope of the invention.
  • In the past, computer graphics modeling has not included the ability to dynamically incorporate geometry elements at run-time into background geometry or a global geometry. One problem to be overcome in order to directly incorporate an additional piece of geometry into the background geometry is that the inset geometry needs to know specific things about the background geometry in order to be arranged, clipped and combined with the background geometry. The computations required to correctly insert an inset geometry into a background geometry are complex and time consuming. Thus, these computations have been performed in the past at compile time. In addition, there is a quantization or sampling problem that can occur at the boundary of an inset geometry that is being added to the background geometry. The quantization problems can create visual anomalies and cracks at boundaries between the inset geometry and the background geometry. The present invention overcomes these problems and allows an inset geometry to be included in a background geometry in real time.
  • The present invention discloses a video graphics display system that is able to merge a smaller inset geometry at run time into a larger background geometry. The inset geometry can replace a section of the background geometry and a gradual transition from the geometric behavior of the larger section to the geometric behavior of the smaller section is achieved in an advantageous way. The transition allows a composite geometry created from the two separate geometries to be visually continuous and to avoid cracks between the two geometries during display. The relative positions of the inset geometry and larger background geometry need not be known until run time in the present invention.
  • Visual continuity of the newly formed composite geometry is achieved, in part, by constructing a “skirt” around the perimeter of the inset geometry. The skirt comprises a set of polygons that extend from the perimeter of the inset geometry outward a given distance. The distance of the skirt between the inset geometry and the background geometry need not be constant in all directions. Skirt polygons have vertices that are completely and exactly coincident with the inset geometry along the perimeter of the inset. This method and system can be implemented within the computer graphics pipeline or in more specific embodiments in the geometry subsystem.
  • The present invention will be described more generally in relation to FIGS. 2-5 and then further details will be discussed later along with FIGS. 6-10. The addition of an inset geometry into a background geometry typically means that the background geometry model will have a certain geometric portion removed, as required for each specific situation. An example of an insert being made into background geometry is shown in FIGS. 2-5. The term background geometry is generally used here to mean a larger geometry or a global geometry within which a smaller geometry can be inset or inserted. In terms of the present invention, this means a smaller piece of inset geometry can be inserted into a larger piece of background geometry in real time without requiring a recompilation of the entire geometry model database. For example, an airport, army base, city, or any group of buildings can be inserted into a larger terrain. On a smaller scale, a crater or destroyed building portion can be inserted into the geometry of a building, dam, or group of buildings. The background geometry can also be described as the primary geometry or global geometry.
  • FIGS. 2-5 illustrate the process of incorporating a new inset geometry within a background geometry. For example, a terrain 100 may be the background geometry. Alternatively, the background geometry might be a building, dam, or mountain that incorporates a geometric displacement (e.g. by an explosion) and requires new inset geometry. Another situation that might require dynamic inset geometry would be a changing geometry, such as a bulldozer making cuts to the landscape in real time. In the case of an airport on mostly smooth or level terrain, the inset geometry can be the portion of the geometry inside the airport fence. For more complex images or models that involve insets on cuts and fills, some additional surrounding terrain may be included.
  • FIG. 3 shows the model with a portion removed 108; the insertion requires additional geometry and control information. One or more polygons are used to define the footprint of the inset geometry. The footprint is used to define an opening cutter 104 (FIG. 2) used to cut a hole or opening 106 in the background geometry 102 or global geometry. For example, the opening cutter can define an area of a terrain hole to accommodate an inset geometry 110, as in FIG. 4. The cutting can be implemented by a hierarchical suppression of geometry facets based on whether they are completely surrounded by the hole cutter 104, or implemented in other ways known to those skilled in the art. During the defining of the cutter footprint and the cutting of the background geometry, polygon facets that are not completely surrounded by the cutter (that straddle the cutter boundary) are divided into polygon fragments by the cutter boundary so that all the background polygons will be either in or out of the cutter boundary.
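The facet-level decision above (completely surrounded by the cutter, straddling its boundary, or outside it) can be sketched with a simple 2-D point-in-polygon test. This is an illustrative sketch only, not the patent's implementation; the function names are assumptions, and a production version would also handle edge-crossing cases that a vertex-only test misses.

```python
def point_in_polygon(pt, poly):
    """Even-odd ray-casting test for a 2-D point against a polygon footprint."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the horizontal ray from pt
            xc = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < xc:
                inside = not inside
    return inside

def classify_facet(facet, cutter):
    """Return 'inside' (suppress), 'outside' (keep), or 'straddle' (clip)."""
    flags = [point_in_polygon(v, cutter) for v in facet]
    if all(flags):
        return "inside"
    if not any(flags):
        # Simplification: a full test would also check whether facet and
        # cutter edges intersect before declaring the facet outside.
        return "outside"
    return "straddle"
```

Facets classified as "straddle" would then be divided by the cutter boundary so that every resulting fragment is wholly in or out of the cutter.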
  • A skirt of polygons 114 around the perimeter of the inset geometry is created so the outside vertices conform to the background geometry and the inside vertices are coincident with the inset geometry 110, as shown in FIG. 5. The outside skirt vertices are shared with the vertices of the hole cutter 104. The skirt 114 is conformed to the background around its outside perimeter by clipping it to the underlying background polygons. This creates additional vertices along the inside edge of the skirt, where the skirt joins the inset geometry. These additional vertices lie geometrically along the original, unclipped inside edges of the skirt. Other vertices created during the clipping operation that lie between the inside and outside edges of the skirt get a blend of behavior between the inside and outside skirt vertices.
  • Now that the invention has been described generally, further details will be presented with respect to the method for the invention. FIG. 6 illustrates a flow chart depicting operations performed in an embodiment of the invention. A skirt is constructed around the perimeter of the inset geometry in block 200. The skirt polygons extend outward from the inset geometry a pre-selected distance that is not necessarily constant in all directions. This skirt shares vertices completely and exactly with the inset geometry along the perimeter of the inset. The combination of the inset geometry with the skirt creates an extended inset.
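As a rough illustration of the skirt construction in block 200, the sketch below builds a ring of quads whose inner vertices are shared exactly with the inset perimeter. The fixed width and the push-away-from-center heuristic are assumptions for illustration; the patent notes the distance need not be constant in all directions.

```python
import math

def build_skirt(perimeter, center_xy, width=5.0):
    """Extend a ring of skirt quads outward from the inset perimeter.

    Inner vertices are shared exactly with the inset perimeter vertices;
    outer vertices are pushed laterally away from the inset's center.
    """
    quads, n = [], len(perimeter)

    def push(p):
        # Move a perimeter vertex 'width' units away from the lateral center.
        dx, dy = p[0] - center_xy[0], p[1] - center_xy[1]
        length = math.hypot(dx, dy) or 1.0
        return (p[0] + width * dx / length, p[1] + width * dy / length, p[2])

    for i in range(n):
        a, b = perimeter[i], perimeter[(i + 1) % n]
        # (inner edge a-b shared with the inset, outer edge pushed outward)
        quads.append((a, b, push(b), push(a)))
    return quads
```

The combination of the inset geometry with this ring forms the extended inset described above.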
  • Next, the portion of the background geometry overlaid by the extended inset is removed. This is done by constructing a cutter polygon or polygons from the outside perimeter vertices of the skirt in block 202. The cutter polygon 212 is illustrated in FIG. 7 as it overlays the background geometry 214. The cutter polygon is clipped into a contiguous mesh of cutter fragments 216, where each fragment is wholly and exactly contained within a background polygon of the background geometry that it overlays and is geometrically coplanar with its associated background polygon. The initial cutter polygon is not necessarily in a plane and it will be divided into polygons or triangles to match the actual background geometry. In addition, the cutter polygon may be applied by rendering it into the pixel frame buffer as a stencil mask to remove background polygons and portions of background polygons covered by the cutter.
  • In the embodiment described, there is a “projection direction” of the cutter polygon 212 onto the background polygons 214. For background geometry, such as a simulated terrain, the cutter polygons are projected vertically onto the background geometry and clipped against vertical clip planes defined by the edges of the underlying background polygons.
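With a vertical projection direction, clipping a cutter (or skirt) polygon against the vertical clip planes defined by a background polygon's edges reduces to 2-D polygon clipping in the lateral plane. A minimal Sutherland-Hodgman sketch, assuming counter-clockwise background triangles (an assumption of this example, not stated in the patent):

```python
def clip_to_halfplane(poly, a, b):
    """Clip a 2-D polygon against the half-plane to the left of the directed
    edge a->b (i.e., one vertical clip plane when projecting vertically)."""
    def side(p):
        # > 0 left of a->b, < 0 right, == 0 on the edge
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])
    out = []
    n = len(poly)
    for i in range(n):
        cur, nxt = poly[i], poly[(i + 1) % n]
        cs, ns = side(cur), side(nxt)
        if cs >= 0:
            out.append(cur)
        if (cs >= 0) != (ns >= 0):
            # Edge crosses the clip plane: emit the intersection point.
            t = cs / (cs - ns)
            out.append((cur[0] + t * (nxt[0] - cur[0]),
                        cur[1] + t * (nxt[1] - cur[1])))
    return out

def clip_to_background_triangle(poly, tri):
    """Clip a polygon to one CCW background triangle, yielding the fragment
    wholly and exactly contained within that triangle."""
    for i in range(3):
        poly = clip_to_halfplane(poly, tri[i], tri[(i + 1) % 3])
        if not poly:
            break
    return poly
```

Running this for every overlapped background polygon yields the contiguous mesh of cutter fragments, each contained in exactly one background polygon.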
  • Each cutter fragment can then be associated with the background polygon it overlays so that subsequent graphics operations can properly apply cutters to background polygons in stencil-like operations. In addition, any background polygon that is completely surrounded by a cutter can be discarded, along with its overlaid cutter polygon, reducing subsequent processing and rendering effort. FIG. 8 illustrates that the application of the cutter polygon 212 generates a hole or opening in the background geometry that is precisely the shape of the inset geometry 222 and the skirt 220 to be inserted. The size of the geometry removed by the cutter polygon is greater than the inset geometry to allow for the skirt. This means that the inset geometry does not need to know the details about the background geometry, and the skirt can act as a geometric blend region. In addition, the inset geometry does not need to be clipped because the skirt will be matched to the inset geometry along their shared perimeter. Avoiding clipping the inset geometry reduces the complexity of inserting any given inset geometry into the background geometry. In this process, neither the background nor the inset geometry is modified, which is a valuable advantage of the present invention.
  • The hole cutting technique is designed to utilize the largest possible cutter polygons in order to maximize the ability to detect and discard background geometry facets that are wholly within the region of interest. If multiple cutter polygons are utilized, it is preferable (although not required) that they form a contiguous tiling of the area to be removed or suppressed, with complete vertex sharing. The system may not be able to discern geometry facets that could be completely removed when they are covered only by the collective effect of several cutter polygons. As such, it is more effective when the system utilizes a single convex cutter polygon, even if it requires a large number of vertices. It is more efficient to discard a completely covered background polygon than to render it (along with an associated cutter polygon) and then erase it pixel-by-pixel with a stencil operation.
  • Once the opening in the background geometry is formed, FIG. 9 illustrates that the skirt polygons are clipped against the background polygons they overlay in block 204 (FIG. 6). This generates a set of skirt fragments 230, each of which is entirely contained within one background polygon. In other words, the skirt fragments will be equal in size or smaller than the background polygons in which they are contained, and the points at which the background polygons intersect the skirt 232 help define where the fragments will be divided. In most cases, the set of clipped skirt fragments will be greater in number than the original set of skirt polygons. These skirt fragments are not in a plane but will generally conform to the background against which they were clipped and as modified by the blending.
  • Once the skirt polygons are clipped against the background polygons, updated vertex coordinates 232 in the projection direction are computed for the clipped skirt fragment polygons as in 206 (FIG. 6). In the case of a vertical projection direction, this means that new Z or altitude components for the skirt fragments can be computed. The computed component values can be a blend of at least one background attribute and at least one inset attribute at the skirt vertex, where these attributes might include geometry, shading, texture and color data. The blend factor can be based on the relative lateral location of each fragment vertex in the skirt. In other words, vertices on the outside edge of the skirt receive 100% background behavior, vertices on the inside edge of the skirt receive 100% inset behavior, and vertices in between get an appropriate blend of the two. The clipped and conformed skirt thus becomes an integration or transition region between background behavior and inset behavior.
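The per-vertex blend just described can be sketched as a linear interpolation driven by the vertex's relative lateral position in the skirt. The distance-ratio blend factor shown here is one plausible choice under a vertical projection, not necessarily the patent's; the same factor would blend shading, texture, or color attributes as well as Z.

```python
import math

def blend_skirt_vertex(v_xy, inner_pt, outer_pt, inset_z, background_z):
    """Blend the projected (Z) component of a clipped skirt fragment vertex.

    t = 0 at the inner edge gives 100% inset behavior, t = 1 at the outer
    edge gives 100% background behavior, and vertices in between receive
    an appropriate blend of the two.
    """
    span = math.dist(inner_pt, outer_pt)
    t = 0.0 if span == 0 else min(1.0, max(0.0, math.dist(inner_pt, v_xy) / span))
    return (1.0 - t) * inset_z + t * background_z
```

For example, a vertex laterally halfway across the skirt would receive the average of the inset and background altitudes, making the skirt a continuous transition region.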
  • FIG. 10 illustrates that a first drop wall 252 of polygons beneath the visible surface of the model is constructed (208 in FIG. 6) after the new vertex coordinates are computed. The top vertices 260 of the drop wall polygons are shared with the original perimeter vertices of the inset geometry. The bottom vertices of these polygons are shifted outward slightly and their visible sides face outward from the inset geometry. As illustrated in FIG. 10, the wall polygons are below the nominal “surface” which means that the drop wall is not visible to the end user.
  • The next step is creating a second drop wall of polygons in 210. Once the first wall is constructed, a second drop wall of polygons 256 is then constructed and its top vertices are shared with the clipped skirt fragment vertices 258 along the perimeter. The bottom vertices of these polygons are shifted slightly inward and their visible sides face inward. At this point, the system then renders all the resultant polygons.
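A hypothetical sketch of both drop walls follows: top vertices are shared with the supplied perimeter (the inset perimeter for the first wall, the clipped skirt's inner fragment vertices for the second), while bottom vertices are dropped below the surface and shifted slightly outward or inward so the two walls geometrically intersect. The shift-from-center heuristic and the numeric defaults are illustrative assumptions.

```python
def build_drop_wall(perimeter, center_xy, drop=2.0, shift=0.1, outward=True):
    """Build hidden drop-wall quads beneath the visible surface.

    outward=True  -> first wall: bottoms shifted away from the inset center.
    outward=False -> second wall: bottoms shifted toward the inset center.
    """
    quads = []
    n = len(perimeter)
    sign = 1.0 if outward else -1.0
    for i in range(n):
        tops = (perimeter[i], perimeter[(i + 1) % n])
        bottoms = []
        for (x, y, z) in tops:
            # Lateral direction away from the inset's center.
            dx, dy = x - center_xy[0], y - center_xy[1]
            length = (dx * dx + dy * dy) ** 0.5 or 1.0
            bottoms.append((x + sign * shift * dx / length,
                            y + sign * shift * dy / length,
                            z - drop))
        # Quad ordering gives a consistent winding along the wall strip.
        quads.append((tops[0], tops[1], bottoms[1], bottoms[0]))
    return quads
```

Calling this once with the inset perimeter (outward) and once with the clipped skirt's inner vertices (inward) produces the two intersecting walls described below.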
  • These two walls are geometrically coincident along their top edge, but they do not have the same number of polygons, and do not share all vertices. This is significant because the points that join the first and second drop walls to either the inset geometry or background geometry are separate points. This means that although the top vertices of the drop walls are intended to be located at the same numerical point (or along the same geometric edge), the points may shift slightly when the geometry is rotated or translated for viewing. Such shifting is caused by numerical or quantization inaccuracies that are created by a finite accuracy floating point system. In other words, at certain points in time the floating point calculation may be slightly inaccurate and the points 258, 260 will be slightly separated; without the drop walls, that separation would allow a user to see the backdrop through the geometric model. Not only does the intersection of these two drop walls prevent cracks when there is a lateral separation between the geometries, it also prevents cracks when there is a vertical separation or inaccuracy. This prevents an end user from seeing between or under the perimeter that joins the skirt and inset geometry.
  • The clipped skirt polygons share vertices exactly with the cutter polygons along the outside perimeter, preventing visual cracks along the boundary between the background geometry and the skirt. The first set of drop wall polygons shares vertices exactly with the inside edge of the skirt, and a second set of wall polygons shares vertices exactly with the outside edge of the original inset. Since the walls intersect geometrically (due to the shift of their bottom vertices), any potential visual cracks along the boundary between the original inset and the inside edge of the skirt are geometrically closed off, even if the coincident points 258, 260 are momentarily not coincident. The result is the visually seamless insertion of one area of geometry inside another without any visual anomalies or cracks.
  • Because the inset can have a variable level-of-detail, the skirt and drop wall polygons derived from the inset will also have a variable level-of-detail. In addition, the wall polygons can be constructed so that they are just large enough to cover any potential crack. In general, this means that the height of each wall polygon is computed to subtend about a pixel. This process takes into account the distance of each original inset perimeter vertex from the eye, and computes a vertical drop for the corresponding wall bottom vertex that will (after perspective transformation) cause it to be about one pixel below, so that the wall polygon at that same position is about one pixel high. This minimizes the pixel fill-rate cost of the unseen wall polygons.
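The one-pixel wall sizing can be sketched as a small-angle computation: the vertical drop needed at a perimeter vertex is roughly its distance from the eye times the angle subtended by one pixel. The symmetric vertical field of view and the resolution defaults are illustrative assumptions.

```python
import math

def one_pixel_drop(vertex, eye, fov_deg=60.0, v_res=1080):
    """Compute the vertical drop for a wall bottom vertex so the wall
    subtends roughly one pixel after perspective projection.

    Assumes a symmetric vertical field of view of fov_deg degrees spread
    over v_res pixel rows (small-angle approximation).
    """
    dist = math.dist(vertex, eye)
    pixel_angle = math.radians(fov_deg) / v_res  # approx. radians per pixel
    return dist * math.tan(pixel_angle)
```

For a vertex 100 m from the eye with a 60-degree, 1080-row view, the drop works out to roughly 10 cm, keeping the unseen wall's pixel fill-rate cost near one pixel of height.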
  • The cutter polygon or polygons form a hole in the background geometry that exactly accommodates the inset geometry. If the underlying areas are not properly concealed, defects appear during the actual display of the image. As mentioned, these defects can cause visual problems for users who view the final output. These defects include shifts in the polygon boundaries that expose light or the color of any backdrop layer, i.e., a “crack” that lets whatever is behind show through erroneously.
  • The defects can also provide discontinuities that are affected by the virtual instrumentation utilized within the simulation environment. For example, an infrared sensor may detect visual errors as a temperature difference between the desired geometry and the cracks, which can significantly distort the function and information provided to the user during the simulation. This can result in serious errors in performance and training. Further, bright spots may occur that are aesthetically unappealing during the training exercise and actually become distracting to the students utilizing the simulation environment. Since the process of generating images is often done at a rate of 60+ images per second, as the geometry is shifted during the simulation exercise, flickering may occur that distracts the student as well as provides false information to the virtual instrumentation utilized within a simulation system.
  • Accordingly, it is important to cover these regions of possible cracks to prevent their inclusion during a simulation event. The approach of the present invention utilizes vertical walls of polygons constructed along the inner edge of the skirt, which can extend several meters below ground within the global terrain. Since the walls are vertical, they are easily constructed in an automatic fashion from the vertices defining the inner edge of the clipped skirt. To complement this wall, a modeled-in-place, sloping, below-ground wall is generated as part of the inset geometry. This structure is not clipped to the terrain, but provides a region of double coverage that is large enough to close off any cracks completely. Further, these “below ground” structures will not frequently be seen during the display operation, so they can be simplified by omitting texture or shading finesse.
  • An example of how this system can be used is in a flight or vehicle simulator. For example, an airport terrain insert may be desired at a specific point in a global geometry. The present invention allows the airport to be inserted without modifying or recompiling the global geometry. The airport insert to be added in real time may also contain modeling errors. The local detail, however, must be matched up along the cut or skirt boundary and divided into versions that can be manipulated to match along the boundary. This matching becomes more difficult if the airport position must be adjusted at run time in order to get each of the several runway thresholds at the right place. The present invention overcomes these problems by using a skirt that allows the airport insert geometry to be adjusted slightly so that the boundaries match up without creating visual anomalies.
  • It is to be understood that the above-referenced arrangements are illustrative of the application of the principles of the present invention. While the present invention has been shown in the drawings and described above in connection with the exemplary embodiment(s) of the invention, numerous modifications and alternative arrangements can be devised without departing from its spirit and scope. It will be apparent to those of ordinary skill in the art that numerous modifications can be made without departing from the principles and concepts of the invention as set forth in the claims.

Claims (5)

1. A method for integrating an inset geometry within a background geometry, comprising:
identifying a perimeter of the inset geometry;
extending a skirt, having an outer perimeter and an inner perimeter, from the perimeter of the inset geometry out over the background geometry;
removing portions of the background geometry that are covered by the inset geometry and skirt; and
modifying the skirt so that the outer perimeter of the skirt matches background geometry behavior and the inner perimeter matches inset geometry behavior and a continuous transition exists between the outer perimeter and the inner perimeter.
2. A method as in claim 1, further comprising the step of providing additional geometric constructs below the nominal surface of the skirt and inset geometry to conceal gaps between the skirt and inset geometry.
3. A method as in claim 2, wherein the step of providing additional geometric constructs further comprises the step of providing a first polygon wall having shared top vertices with the perimeter of the inset geometry and located beneath the surface of the inset geometry, where bottom vertices of the first polygon wall are shifted outward from the inset geometry.
4. A method as in claim 3, wherein the step of providing additional geometric constructs further comprises the step of providing a second polygon wall with top vertices shared with the polygon skirt and the bottom vertices shifted inward toward the inset geometry, whereby any gaps are concealed between the inset geometry and the background geometry during display.
5-32. (canceled)
US10/979,892 2002-10-09 2004-11-01 System and method for run-time integration of an inset geometry into a background geometry Abandoned US20050088456A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/979,892 US20050088456A1 (en) 2002-10-09 2004-11-01 System and method for run-time integration of an inset geometry into a background geometry

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/268,528 US6816169B2 (en) 2002-10-09 2002-10-09 System and method for run-time integration of an inset geometry into a background geometry
US10/979,892 US20050088456A1 (en) 2002-10-09 2004-11-01 System and method for run-time integration of an inset geometry into a background geometry

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/268,528 Division US6816169B2 (en) 2002-10-09 2002-10-09 System and method for run-time integration of an inset geometry into a background geometry

Publications (1)

Publication Number Publication Date
US20050088456A1 true US20050088456A1 (en) 2005-04-28

Family

ID=32068587

Family Applications (5)

Application Number Title Priority Date Filing Date
US10/268,528 Expired - Lifetime US6816169B2 (en) 2002-10-09 2002-10-09 System and method for run-time integration of an inset geometry into a background geometry
US10/980,123 Expired - Lifetime US7053913B2 (en) 2002-10-09 2004-11-01 System and method for run-time integration of an inset geometry into a background geometry
US10/979,892 Abandoned US20050088456A1 (en) 2002-10-09 2004-11-01 System and method for run-time integration of an inset geometry into a background geometry
US10/979,443 Expired - Lifetime US7053912B2 (en) 2002-10-09 2004-11-01 System and method for run-time integration of an inset geometry into a background geometry
US10/979,442 Expired - Lifetime US7053911B2 (en) 2002-10-09 2004-11-01 System and method for run-time integration of an inset geometry into a background geometry

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US10/268,528 Expired - Lifetime US6816169B2 (en) 2002-10-09 2002-10-09 System and method for run-time integration of an inset geometry into a background geometry
US10/980,123 Expired - Lifetime US7053913B2 (en) 2002-10-09 2004-11-01 System and method for run-time integration of an inset geometry into a background geometry

Family Applications After (2)

Application Number Title Priority Date Filing Date
US10/979,443 Expired - Lifetime US7053912B2 (en) 2002-10-09 2004-11-01 System and method for run-time integration of an inset geometry into a background geometry
US10/979,442 Expired - Lifetime US7053911B2 (en) 2002-10-09 2004-11-01 System and method for run-time integration of an inset geometry into a background geometry

Country Status (1)

Country Link
US (5) US6816169B2 (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7589746B2 (en) * 2006-03-23 2009-09-15 Intel Corporation Optimized frustum clipping via cached clip vertices
US7639249B2 (en) * 2006-05-05 2009-12-29 Microsoft Corporation Direct inset beveling of geometric figures
US20080036911A1 (en) * 2006-05-05 2008-02-14 Robert Noory Method and apparatus for synchronizing a graphics signal according to a reference signal
EP2104930A2 (en) 2006-12-12 2009-09-30 Evans & Sutherland Computer Corporation System and method for aligning rgb light in a single modulator projector
US9188850B2 (en) * 2007-09-10 2015-11-17 L-3 Communications Corporation Display system for high-definition projectors
JP5034806B2 (en) * 2007-09-13 2012-09-26 富士通セミコンダクター株式会社 Graphic drawing apparatus, graphic drawing method, graphic drawing program, and recording medium storing the program
US8358317B2 (en) 2008-05-23 2013-01-22 Evans & Sutherland Computer Corporation System and method for displaying a planar image on a curved surface
US8702248B1 (en) 2008-06-11 2014-04-22 Evans & Sutherland Computer Corporation Projection method for reducing interpixel gaps on a viewing surface
US8077378B1 (en) 2008-11-12 2011-12-13 Evans & Sutherland Computer Corporation Calibration system and method for light modulation device
CA2698052C (en) * 2009-03-30 2021-02-02 Stickeryou, Inc. Internet-based method and system for making user-customized stickers
US11230026B2 (en) 2009-03-30 2022-01-25 Stickeryou Inc. Device, system and method for making custom printed products
US20120233210A1 (en) * 2011-03-12 2012-09-13 Matthew Thomas Bogosian Storage of Arbitrary Points in N-Space and Retrieval of Subset thereof Based on Criteria Including Maximum Distance to an Arbitrary Reference Point
US9641826B1 (en) 2011-10-06 2017-05-02 Evans & Sutherland Computer Corporation System and method for displaying distant 3-D stereo on a dome surface
US20150058390A1 (en) * 2013-08-20 2015-02-26 Matthew Thomas Bogosian Storage of Arbitrary Points in N-Space and Retrieval of Subset Thereof Based on a Determinate Distance Interval from an Arbitrary Reference Point
US10297077B1 (en) * 2016-03-02 2019-05-21 Valve Corporation Hidden area stencil mesh rendering systems and methods

Citations (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3816726A (en) * 1972-10-16 1974-06-11 Evans & Sutherland Computer Co Computer graphics clipping system for polygons
US4616262A (en) * 1983-11-14 1986-10-07 Dainippon Ink And Chemicals, Incorporated Method and apparatus for forming a combined image signal
US4785399A (en) * 1987-03-03 1988-11-15 International Business Machines Corporation Shaping geometric objects by cumulative translational sweeps
US4885703A (en) * 1987-11-04 1989-12-05 Schlumberger Systems, Inc. 3-D graphics display system using triangle processor pipeline
US4994989A (en) * 1987-10-09 1991-02-19 Hitachi, Ltd. Displaying method and apparatus for three-dimensional computer graphics
US5113490A (en) * 1989-06-19 1992-05-12 Silicon Graphics, Inc. Method for forming a computer model from an intersection of a cutting surface with a bounded volume
US5204918A (en) * 1990-06-28 1993-04-20 Dainippon Screen Mfg. Co., Ltd. Method of and apparatus for correcting contour of image
US5313570A (en) * 1993-03-31 1994-05-17 Miles, Inc. Method for determining color boundaries for correcting for plate misregistration in color printing
US5359526A (en) * 1993-02-04 1994-10-25 Hughes Training, Inc. Terrain and culture generation system and method
Patent Citations (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3816726A (en) * 1972-10-16 1974-06-11 Evans & Sutherland Computer Co Computer graphics clipping system for polygons
US4616262A (en) * 1983-11-14 1986-10-07 Dainippon Ink And Chemicals, Incorporated Method and apparatus for forming a combined image signal
US4785399A (en) * 1987-03-03 1988-11-15 International Business Machines Corporation Shaping geometric objects by cumulative translational sweeps
US4994989A (en) * 1987-10-09 1991-02-19 Hitachi, Ltd. Displaying method and apparatus for three-dimensional computer graphics
US4885703A (en) * 1987-11-04 1989-12-05 Schlumberger Systems, Inc. 3-D graphics display system using triangle processor pipeline
US5113490A (en) * 1989-06-19 1992-05-12 Silicon Graphics, Inc. Method for forming a computer model from an intersection of a cutting surface with a bounded volume
US5204918A (en) * 1990-06-28 1993-04-20 Dainippon Screen Mfg. Co., Ltd. Method of and apparatus for correcting contour of image
US5715385A (en) * 1992-07-10 1998-02-03 Lsi Logic Corporation Apparatus for 2-D affine transformation of images
US5377320A (en) * 1992-09-30 1994-12-27 Sun Microsystems, Inc. Method and apparatus for the rendering of trimmed nurb surfaces
US5428718A (en) * 1993-01-22 1995-06-27 Taligent, Inc. Tessellation system
US5359526A (en) * 1993-02-04 1994-10-25 Hughes Training, Inc. Terrain and culture generation system and method
US5313570A (en) * 1993-03-31 1994-05-17 Miles, Inc. Method for determining color boundaries for correcting for plate misregistration in color printing
US5369739A (en) * 1993-07-09 1994-11-29 Silicon Graphics, Inc. Apparatus and method for generating point sample masks in a graphics display system
US6195609B1 (en) * 1993-09-07 2001-02-27 Harold Robert Pilley Method and system for the control and management of an airport
US5630037A (en) * 1994-05-18 1997-05-13 Schindler Imaging, Inc. Method and apparatus for extracting and treating digital images for seamless compositing
US6339433B1 (en) * 1994-09-13 2002-01-15 Canon Kabushiki Kaisha Creating a blend of color and opacity between arbitrary edges
US5905502A (en) * 1995-08-04 1999-05-18 Sun Microsystems, Inc. Compression of three-dimensional graphics data using a generalized triangle mesh format utilizing a mesh buffer
US6577777B1 (en) * 1995-10-11 2003-06-10 Dainippon Screen Mfg. Co., Ltd. Image processing using adjoining relationships between image parts
US6348919B1 (en) * 1995-12-18 2002-02-19 3Dlabs Inc, Ltd. Graphics system with optimized use of unified local and frame buffers
US6262739B1 (en) * 1996-10-16 2001-07-17 Real-Time Geometry Corporation System and method for computer modeling of 3D objects or surfaces by mesh constructions having optimal quality characteristics and dynamic resolution capabilities
US6330001B1 (en) * 1997-09-10 2001-12-11 Nec Corporation Device and computer-readable record medium for image position adjustment
US6052129A (en) * 1997-10-01 2000-04-18 International Business Machines Corporation Method and apparatus for deferred clipping of polygons
US6166744A (en) * 1997-11-26 2000-12-26 Pathfinder Systems, Inc. System for combining virtual images with real-world scenes
US6289364B1 (en) * 1997-12-22 2001-09-11 Adobe Systems, Inc. Transparency processing in a page description language
US6124857A (en) * 1998-08-12 2000-09-26 International Business Machines Corporation Meshing method and apparatus
US6489967B1 (en) * 1998-09-02 2002-12-03 Namco Limited Image formation apparatus and image formation method
US6307558B1 (en) * 1999-03-03 2001-10-23 Intel Corporation Method of hierarchical static scene simplification
US6515675B1 (en) * 1999-11-22 2003-02-04 Adobe Systems Incorporated Processing opaque pieces of illustration artwork
US20030184563A1 (en) * 2002-03-26 2003-10-02 Kenneth J. Wiant Efficient digital map overlays

Also Published As

Publication number Publication date
US7053912B2 (en) 2006-05-30
US20050093882A1 (en) 2005-05-05
US7053913B2 (en) 2006-05-30
US20050093864A1 (en) 2005-05-05
US20040070587A1 (en) 2004-04-15
US6816169B2 (en) 2004-11-09
US20050062761A1 (en) 2005-03-24
US7053911B2 (en) 2006-05-30

Similar Documents

Publication Publication Date Title
US6816169B2 (en) System and method for run-time integration of an inset geometry into a background geometry
Neumann et al. Augmented virtual environments (AVE): Dynamic fusion of imagery and 3D models
US5630718A (en) Weather simulation system
US5412796A (en) Method and apparatus for generating images simulating non-homogeneous fog effects
CN110136219A A 2D/3D map display method based on multi-source data fusion
CN111508052B (en) Rendering method and device of three-dimensional grid body
EP2766878B1 (en) Layered digital image data reordering
US20110279470A1 (en) Systems and methods for the real-time and realistic simulation of natural atmospheric lighting phenomenon
US5409379A (en) Weather simulation system
US6943803B1 (en) Anti-aliased, textured, geocentric and layered fog graphics display method and apparatus
WO1998038591A2 (en) Method for rendering shadows on a graphical display
Dollner et al. Real-time expressive rendering of city models
EP0532579B1 (en) Image generator
CA2702741A1 (en) Geospatial modeling system using void filling and related methods
US7257519B2 (en) System and method for weighted correction of an eyepoint position
Bergen et al. Data-driven simulation, dimensional accuracy and realism in a landscape visualization tool
Kennie et al. Modelling for digital terrain and landscape visualisation
US6940504B1 (en) Rendering volumetric fog and other gaseous phenomena using an alpha channel
CN115690344A (en) Sponge city sand table and weather simulation system
Vyatkin et al. Database components for visual systems
EP1156455A2 (en) Fog visual effect rendering method
Vince The art of Simulation
Ozimek et al. Computer-Aided Method of Visual Absorption Capacity Estimation
Feibush et al. Geo-spatial visualization for situational awareness
CN115496884A A virtual-real cockpit fusion method based on SRWorks video see-through technology

Legal Events

Date Code Title Description
AS Assignment

Owner name: EVANS & SUTHERLAND COMPUTER CORPORATION, UTAH

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:COSMAN, MICHAEL A.;REEL/FRAME:015957/0743

Effective date: 20021007

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION