US20090305782A1 - Double render processing for handheld video game device - Google Patents
- Publication number
- US20090305782A1 (application US 12/136,563)
- Authority
- US
- United States
- Prior art keywords
- layer
- rendered
- information
- memory
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- A63F13/10—
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/45—Controlling the progress of the video game
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/90—Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
- A63F13/92—Video game devices specially adapted to be hand-held while playing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
- A63F13/26—Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1068—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
- A63F2300/1075—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad using a touch screen
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/20—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
- A63F2300/204—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform the platform being a handheld device
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/30—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
- A63F2300/301—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device using an additional display connected to the game console, e.g. on the controller
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
Definitions
- the present invention relates generally to handheld devices, and more particularly to image rendering for a handheld video game device.
- Video games provide a source of enjoyment to users by allowing users to engage in simulated scenarios and situations the users may not otherwise be able to experience. Video games receive different types of interactive user inputs, and process the inputs into vibrant interactive visual displays and audio accompaniments for the users to enjoy.
- Handheld video game devices or other mobile devices providing video game functions, are often preferred over traditional video game consoles due to their convenience and mobility. Because of the relatively small size of most handheld video game devices, handheld video game devices allow for easy transport and playing flexibility in environments which would typically be unsuitable for video game play using traditional video game consoles.
- the tradeoff for the small size and mobility of handheld video game devices is generally manifested in the processing power and video display capabilities of the handheld video game devices.
- the relatively small housings of handheld video game devices do not allow for the hardware capacity and processing power of traditional video game consoles.
- the smaller platforms allow for only limited screen sizes, further reducing the video display capabilities of handheld video game devices. While recent years have seen marked improvements in the video display capabilities in a number of handheld video game device platforms, generally the capacity of video displays in handheld video game devices still falls far short of video display capabilities of more traditional video game consoles.
- in various aspects, the invention provides improved image displays for a video game.
- the invention provides a method of providing images for a video game, comprising: associating objects with either a first layer or a second layer; rendering objects associated with the first layer; rendering objects associated with the second layer; displaying the rendered objects associated with the first layer on a display; and displaying the rendered objects associated with the second layer on the display.
- the invention provides a method of providing images for a music based video game, comprising: associating a first object with a first display layer, the first object representative of a musician in the music based video game; associating a plurality of background objects with a second display layer, the plurality of background objects representative of a venue in the music based video game; iteratively rendering the first object, storing rendered information of the first object in a first memory, and displaying rendered information of the first object on the display; iteratively rendering the plurality of background objects, storing rendered information of the plurality of background objects in a second memory, and displaying rendered information of the plurality of background objects on the display; with displaying rendered information of the first object on the display utilizing the information stored in the first memory in first alternating time periods and displaying rendered information of the plurality of background objects utilizing the information stored in the second memory in second alternating time periods, the first alternating time periods and the second alternating time periods occurring at different times.
- the invention provides a handheld game system, comprising: memory storing scene data, the scene data including first scene data and second scene data; a processor configured to render the first scene data and the second scene data; first video memory, coupled to the processor, configured to store rendered first scene data; second video memory, coupled to the processor, configured to store rendered second scene data; and a display coupled to the first video memory and the second video memory; the processor being further configured to alternately: a) render the first scene data, command display on the display of the rendered first scene data, command display on the display of the rendered second scene data in the second video memory, and command storage of the rendered first scene data in the first video memory, and b) render the second scene data, command display on the display of the rendered second scene data, command display on the display of the rendered first scene data in the first video memory, and command storage of the rendered second scene data in the second video memory.
- the invention provides a method of providing images for a video game, comprising: associating different objects with different layers; rendering objects associated with a first layer of the different layers; storing information of rendered objects associated with the first layer in a first memory; rendering objects associated with a second layer of the different layers; combining the information of rendered objects associated with the first layer in the first memory with information of rendered objects associated with the second layer; and displaying the combined information.
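The method summarized in the aspects above can be sketched in a few lines. The following is an illustrative simulation only, not code from the patent; every function and object name here is hypothetical.

```python
# Minimal sketch of the claimed method: objects are associated with one of two
# layers, each layer is rendered separately, and the rendered results are
# combined for display. All names are illustrative, not from the patent.

def associate(objects, layer_of):
    """Group objects into layers using a caller-supplied mapping."""
    layers = {1: [], 2: []}
    for obj in objects:
        layers[layer_of(obj)].append(obj)
    return layers

def render(objects):
    """Stand-in renderer: returns a list of 'rendered' object tokens."""
    return [f"rendered:{obj}" for obj in objects]

def composite(front, back):
    """Back layer is drawn first; front-layer objects sit on top of it."""
    return back + front

objects = ["singer", "guitarist", "venue", "crowd"]
layers = associate(objects, lambda o: 1 if o in ("singer", "guitarist") else 2)
frame = composite(render(layers[1]), render(layers[2]))
```

Run once per displayed frame, this yields a composite in which the front-layer characters overlay the back-layer venue, mirroring the first and second aspects above.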
- FIG. 1 illustrates a handheld video game system in accordance with aspects of the invention.
- FIG. 2 is a block diagram of example video processing circuitry for a display of a handheld video game device in accordance with aspects of the invention.
- FIG. 3 is a flow diagram of a process of image rendering in a handheld video game device in accordance with aspects of the invention.
- FIG. 4 is a timeline of an image layer alternating process in accordance with aspects of the invention.
- FIG. 5 is an illustration showing an image combining process in accordance with aspects of the invention.
- FIG. 6 is a block diagram of a handheld video game device in accordance with an embodiment of the invention.
- FIG. 7 is a flow diagram of a process of using separately rendered image layers in accordance with aspects of the invention.
- FIG. 8 illustrates a flow of using separately rendered image layers in accordance with aspects of the invention.
- FIG. 9 illustrates different levels of detail of rendered features in accordance with aspects of the invention.
- FIG. 10 is a flow diagram of a further process of using separately rendered image layers in accordance with aspects of the invention.
- FIG. 11 illustrates a further flow of using separately rendered image layers in accordance with aspects of the invention.
- FIG. 1 is an example of a handheld video game system in accordance with aspects of the invention.
- the handheld video game system includes a handheld video game device 101 , including at least one display, at least one user input device, at least one speaker, and at least one removable memory interface.
- Each handheld video game device also includes internal circuitry generally associated with video game devices, such as processing circuitry for executing video game instructions and memory used to store video game information.
- another handheld device capable of providing video game functions may be used instead of the handheld video game device of FIG. 1 .
- the handheld video game device is a Nintendo DS handheld video game device or a Nintendo DS Lite handheld video game device, both widely available in consumer electronics retail stores.
- the handheld video game device incorporates a clamshell design, with a hinge 103 allowing for closure of the handheld video game device, thereby protecting the external components of the handheld video game device while in the closed position.
- the handheld video game device includes two 3-inch displays, each with a resolution of 256 ⁇ 192 pixels, with a first display 105 located on the top portion 107 of the clamshell housing, and a second display 109 located on the bottom portion 111 of the clamshell housing.
- the bottom display may include touchscreen input capabilities.
- speakers 113 are located on each side of the top display, while a digital directional pad input 115 is located to the left of the bottom display, and a plurality of digital input buttons 117 is located to the right of the bottom display.
- additional user inputs may be available on the handheld video game device.
- the handheld video game device also includes two removable memory interfaces.
- the first removable memory interface 119 is generally configured to read a removable memory cartridge holding video game instructions.
- the second removable memory interface 121 is configured to interact with either a removable memory cartridge used in conjunction with an older handheld video game device platform, or a peripheral used in conjunction with video game play of certain video games.
- the handheld video game system illustrated in FIG. 1 also includes a peripheral input device 123 with a plurality of additional input buttons 125 .
- the peripheral input device may be connected to the handheld video game device via the second removable memory interface.
- the peripheral input device is used in conjunction with a music-based rhythm video game.
- the top display is displaying a screenshot from game play of a music based video game.
- the graphics processing unit of the handheld video game device in FIG. 1 may be, for example, capable of rendering up to 2048 polygons per frame per image, at a rate of 60 frames per second.
- the screenshot displayed in the top display of FIG. 1 includes double the allowable number of polygons, with, for example, a maximum display capacity of 4096 total polygons.
- each display of the handheld video game device is capable of displaying at least two separate image layers at the same time.
- the resulting composite image may be double the traditional image resolution, although the effective frame rate may be reduced.
- a reduced frame rate may be acceptable, as the new frame rate may be comparable with television and movie frame rates, which typically have refresh rates between 24 and 30 frames per second.
- FIG. 2 is an example of a block diagram of the video processing circuitry for a display of a handheld video game device in accordance with aspects of the invention.
- the block diagram of FIG. 2 may represent, for example, the circuitry associated with the top display of the handheld video game device as illustrated in FIG. 1 .
- Video display information is generated by processing video game instructions and user inputs.
- a main processor housed in the handheld video game device may begin the video generation process by, for example, compiling and separating scene data and associated video generation information from other video game instructions, and storing the information into memory.
- the scene data and other video generation information is stored in scene data memory 211 .
- scene data memory may be included as an allocated portion of a main memory in the handheld video game device.
- scene data memory may be separate memory in the handheld video game device, allocated specifically for storage of scene data and other video generation information.
- a graphics processing unit 213 retrieves scene data and other video generation information stored in the scene data memory.
- the graphics processing unit processes the information from scene data memory and renders images, for example 2D or 3D images, for display on the video display based on the information.
- the graphics processing unit in the handheld video game device performs image rendering at a rate of 60 images per second.
- the graphics processing unit may alternate image rendering between what may be considered a front layer and what may be considered a back layer.
- the front layer may be used, exclusively in some embodiments, to render images of a lead character or characters, for example a simulated lead guitarist and a simulated lead singer, while the back layer may be used to render images of a background environment, for example a venue, other band members, and remaining image details.
- the front layer may be considered on top of the back layer, thereby blocking display of portions of the back layer.
- the image information for the back layer is occluded.
- images rendered by the graphics processing unit are sent directly to a display screen 215 for immediate display purposes, and are simultaneously captured and saved to a bank of video memory 217 .
- the video memory is memory dedicated to storing video processing information, and is generally capable of storing multiple completed video images at any given time.
- the graphics processing unit renders images and sends the rendered images simultaneously to the display and video memory.
- a video memory A, or VRAM A 219 may be dedicated to storing front layer images rendered by the graphics processing unit, and a video memory B, or VRAM B 221 , may be dedicated to storing back layer images rendered by the graphics processing unit.
- subsequently rendered front layers overwrite a previous front layer stored in VRAM A
- subsequently rendered back layers overwrite a previous back layer stored in VRAM B. Therefore, when a front layer is rendered by the graphics processing unit, the front layer is updated and replaced on the display and in VRAM A, while the back layer displayed is the back layer previously stored in VRAM B. In the next frame, the graphics processing unit renders a new back layer, and the back layer is updated and replaced on both the display and in VRAM B, while the front layer displayed remains the front layer stored in VRAM A from the previous frame.
- the graphics processing unit may continually alternate rendering front layers and back layers in this manner. Therefore, on even frames, for example, the display of the handheld video game device may display a newly updated front layer, and a back layer reused from the previous frame and retrieved from VRAM B. On odd frames, for example, the display of the handheld video game device may alternatively display a newly updated back layer, and a front layer reused from the previous frame and retrieved from VRAM A.
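The alternating schedule just described can be simulated directly. The sketch below (variable names hypothetical, not from the patent) reproduces the even/odd-frame behavior, with VRAM A holding the latest front layer and VRAM B the latest back layer:

```python
# Illustrative simulation of the alternating render schedule: even frames
# render a new front layer into VRAM A and reuse the back layer in VRAM B;
# odd frames render a new back layer into VRAM B and reuse VRAM A.

vram_a = None  # most recently rendered front layer
vram_b = None  # most recently rendered back layer
shown = []     # (front, back) pair the display composites each frame

for frame in range(4):
    if frame % 2 == 0:
        vram_a = f"A{frame // 2 + 1}"  # newly rendered front layer
    else:
        vram_b = f"B{frame // 2 + 1}"  # newly rendered back layer
    shown.append((vram_a, vram_b))     # display combines both layers
```

The resulting sequence, (A1, None), (A1, B1), (A2, B1), (A2, B2), matches the frame-by-frame timeline of FIG. 4.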
- the graphics processing unit of the handheld video game device may only be capable of rendering image layers with 2048 polygons. However, the graphics processing unit may be capable of combining two previously rendered image layers into one composite image including more than 2048 polygons.
- the video memory associated with each graphics processing unit is capable of storing images containing more than 2048 polygons
- the display is capable of displaying images containing more than 2048 polygons.
- the handheld video game device is therefore capable of displaying videos with double the original resolution capacity of the handheld video game device.
- FIG. 3 is a flow diagram of a process of image rendering in a handheld video game device in accordance with aspects of the invention.
- the process may be performed, for example, using the graphics processing unit of the top display in the embodiment as described in FIG. 1 .
- the process associates objects with a layer.
- some objects may be associated with a first layer and some objects may be associated with a second layer.
- association of an object with a layer may be accomplished by storing information in memory correlating objects with a particular layer.
- the information may be stored in a table, for example, in memory separate from the objects.
- Scene data may include, for example, video game instructions from a removable memory including information for running the particular video game being played.
- Scene data may also include, for example, user inputs generated through video game play, retrieved from, for example, user input apparatuses built into the handheld video game device, or from, for example, a peripheral device as illustrated in FIG. 1 , or from, for example, input signals from another handheld video game device via a wireless connection interface.
- Some data may also originate from the main processor of the handheld video game device, which also processes video game instructions and user inputs to generate, for example, video game states associated with the proper functionality of the video game, or from the main memory of the handheld video game device, which stores generated video game states that may include information on image rendering.
- the process, usually by way of a graphics processing unit, renders objects within an image.
- the process renders objects associated with different layers at different times. For example, using an example with two layers, the process may render objects associated with a first layer during a first time period, render objects associated with a second layer during a second time period, and repeat the rendering of objects in different layers in an alternating manner, thereby effectively rendering a first layer of an image and a second layer of an image in an alternating manner.
- the process stores the information associated with each object generated during the image rendering process, or in other words, stores results of rendering objects.
- rendered object information for different layers is stored in different memories.
- the process may also command display of the rendered objects.
- the process may generate a variety of information pertaining to each polygon. For example, the polygon shape and color is generated in block 313 , and the polygon layer identification information and exact display location of the polygon within the layer is generated in block 315 . After the rendered object information has been generated and compiled, it is stored in memory until the entire image layer has been successfully rendered.
- the object information may be stored in the video memory associated with the graphics processing unit, or alternatively, the object information may be temporarily stored in the main memory of the handheld video game device.
- An image layer may be considered to be successfully rendered when all the objects to be rendered associated with the layer have been compiled, and the graphics processing unit can use the compilation of object information to render a completed image layer.
- the process performs the operations of block 311 , relating to association of objects with layers, prior to or when storing game data on, for example, a game cartridge or other memory storing video game instructions and data, and the process may thereafter repetitively perform the operations of blocks 313 , 315 , and 317 during game play.
- the process afterwards returns.
- the process may be repeated based on the object generation progress of the image layer being rendered and on the image rendering requirements of the graphics processing unit.
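As a rough illustration of the per-polygon information generated in blocks 313 through 317, the sketch below models a polygon record and a layer buffer. The field names and structures are assumptions for illustration, not structures defined in the patent.

```python
# Hypothetical model of the FIG. 3 per-polygon flow: shape and color
# (block 313), layer identification and display location (block 315),
# and accumulation in memory until the whole layer is compiled (block 317).

from dataclasses import dataclass, field

@dataclass
class Polygon:
    shape: tuple     # vertex coordinates
    color: str
    layer: int       # which image layer the polygon belongs to
    position: tuple  # exact display location within the layer

@dataclass
class LayerBuffer:
    polygons: list = field(default_factory=list)

    def add(self, poly):
        self.polygons.append(poly)

    def complete(self, expected_count):
        """Layer is successfully rendered once all its objects are compiled."""
        return len(self.polygons) == expected_count

buf = LayerBuffer()
buf.add(Polygon(shape=((0, 0), (1, 0), (0, 1)), color="red",
                layer=1, position=(10, 20)))
```

Once `complete` returns true for a layer, the compiled object information can be handed to the renderer as a finished image layer, as described above.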
- FIG. 4 is a timeline of the image layer alternating process in accordance with aspects of the invention.
- image A1 411, representing the first front layer, is rendered and sent to the display when the process begins at frame 0.
- A1 is also stored into VRAM A, the video memory slot allocated for storage of the most recent front layer image.
- at frame 1, image B1 413, representing the first back layer, is rendered and displayed on the video display, and stored into VRAM B, the video memory slot allocated for storage of the most recent back layer image.
- the display of the handheld video game device is capable of displaying at least two image layers at the same time. Therefore, during frame 1, image A1 is not replaced in VRAM A, and is instead recycled and combined with the new image B1 into a composite image making up the complete screenshot.
- at frame 2, image A2 415, representing the second front layer, is rendered and displayed on the video display, while simultaneously stored into VRAM A.
- image A2 overwrites and replaces the previous image A1, so that there is only one front layer image stored in VRAM A at any given time.
- image B1, which is still stored in VRAM B, is reused, and image A2 is rendered on top of image B1, completing the screenshot.
- image B2 417, representing the second back layer, is rendered and displayed on the video display, and stored into VRAM B, overwriting the previous image B1.
- This process is similar to the storage process associated with VRAM A during frame 2 .
- the new image B2 is layered with image A2 to create the composite screenshot at frame 3.
- this process may be repeated, so that a new front layer image is rendered and stored on every even frame, and a new back layer image is rendered and stored on every odd frame.
- each layer, and consequently each complete screenshot, is re-rendered only 30 times per second.
- a refresh rate of 30 images per second is on par with, for example, television and movies, which typically employ a refresh rate of between 24 and 30 images per second.
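The effective refresh rate follows from simple arithmetic, sketched below for clarity:

```python
# The GPU still renders 60 layer images per second, but because it alternates
# between the front and back layers, each layer (and thus each complete
# composite screenshot) is fully refreshed at half that rate.

gpu_renders_per_second = 60
layers_per_screenshot = 2
effective_refresh = gpu_renders_per_second // layers_per_screenshot  # 30 fps
```

A 30 images-per-second result falls within the 24 to 30 images-per-second range the specification cites for television and movies.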
- FIG. 5 is an illustration showing the image combining process in accordance with aspects of the invention.
- Layer A is a rendered front layer 511 , including a lead singer 513 and a lead guitarist 515 .
- Layer A can display a maximum of 2048 polygons. As the layer A illustrated does not include any background imagery, such as for example, backup singers and venue imagery, the 2048 polygons are used exclusively for rendering of the lead singer and the lead guitarist. In contrast, when the entire screenshot is rendered as one image, the 2048 polygons are used for the whole image, including the lead singer, the lead guitarist, and all of the associated background imagery.
- the invention allows for the lead singer and lead guitarist to be rendered in much greater detail than would be possible when the entire screenshot is rendered as one image.
- layer B is a rendered back layer 517 , including all the background imagery associated with the screenshot. Similar to layer A, layer B can display a maximum of 2048 polygons as well. In this fashion, the background imagery may utilize an increased number of polygons, as the polygons of the 2048 polygons that would otherwise be associated with the lead singer and/or lead guitarist can be used instead to enhance details in the background imagery.
- the rendered layers A and B are combined into one composite screenshot 519 including both layers.
- the composite layer thereby has a maximum rendering capability of 4096 polygons, equivalent to double the video display resolution of traditional image rendering on similar handheld video game devices.
- the portions of layer B where features also exist in layer A are displayed as null 521 , creating silhouette shapes of the lead guitarist and lead singer in the illustrated layer B.
- pixels from layer B are occluded if the pixels already contain objects in layer A, so that layer B contains null spaces upon rendering similar to the null spaces illustrated in FIG. 5 .
- layer B may be a fully rendered image layer, with the aforementioned null spaces also filled in with background image details.
- front layer A may be laid atop back layer B for every frame, before being sent to the display, thereby covering and occluding all layer B pixels where layer A objects already exist.
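The two compositing options just described, a back layer rendered with null silhouettes versus a fully rendered back layer covered by the front layer, can be sketched with tiny pixel lists (`None` standing in for a null or transparent pixel). The names and grid are illustrative assumptions, not from the patent.

```python
# Sketch of the two compositing options from FIG. 5: either the back layer
# already contains null holes where front objects sit, or the full back layer
# is rendered and the front layer simply covers (occludes) it on composite.

def overlay(front, back):
    """Place front-layer pixels over back-layer pixels; None lets back show."""
    return [f if f is not None else b for f, b in zip(front, back)]

front = ["singer", None, None, "guitar"]      # layer A: characters only
back_full = ["wall", "amp", "drums", "wall"]  # layer B, fully rendered
back_holed = [None, "amp", "drums", None]     # layer B with null silhouettes

# Option 1: full back layer, occluded by the front layer on composite.
composite1 = overlay(front, back_full)
# Option 2: back layer pre-holed where layer A objects already exist.
composite2 = overlay(front, back_holed)
```

Both options produce the same visible screenshot; they differ only in whether the occluded back-layer pixels were ever rendered.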
- FIG. 6 is an example of a block diagram of the internal circuitry of a handheld video game device in accordance with another embodiment of the invention.
- the internal circuitry includes a bus 601 coupling together a processor 603 , a main memory 605 , a graphics processing unit for a first display 607 and a graphics processing unit for a second display 609 , an audio driver 611 , and a plurality of different interfaces.
- the interfaces may include a user input/output (I/O) interface 613 , a removable memory interface 615 , a peripheral interface 617 , and a wireless communication interface 619 .
- the graphics processing unit for the first display is coupled to a first allocation of video memory 621 dedicated to the first display through a dedicated bus 623 .
- the graphics processing unit for the second display is also coupled to a second allocation of video memory 625 dedicated to the second display through another dedicated bus 627 .
- the graphics processing units may be integrated into the processor, so that the processor performs all graphics processing tasks. In other embodiments, there may be multiple processors, with each processor having separate data bus connections.
- Handheld video game devices generally integrate displays, speakers, and user inputs directly into the handheld video game device.
- FIG. 6 incorporates the integrated components, with the audio driver coupled to the speakers 629 in the handheld video game device, the graphics processing unit and video memory for a display 1 coupled to display 1 631 , the graphics processing unit and video memory for a display 2 coupled to display 2 633 , and the user I/O reflected as an integrated component rather than as an interface.
- the removable memory interface of the handheld video game device is configured to communicate with a removable memory, for example, a video game cartridge providing video game instructions related to the operation of a specific video game.
- the processor executes the video game instructions from the removable memory by communicating with each component, including the removable memory, via the bus.
- the main memory receives and stores information from the other components as needed for the video game to run properly. Stored information may include, for example, video game play instructions, input processing instructions, audio and video generation information, and configuration information from the removable memory, as well as user inputs from either the user I/O or the peripheral interface.
- the processor adjusts the video game's audio and video properties based in large part on a combination of video game processing instructions from the removable memory and the user inputs.
- a processor may also receive additional video game instructions and inputs from other handheld video game devices via the wireless communication interface, for example, during wireless multiplayer game play.
- the processor of the handheld video game device receives and processes the video game instructions and inputs, and generates audio and video information for the video game based on the instructions and inputs.
- the audio driver is configured to receive the audio information from the processor, and to translate the audio information into audio signals to be sent to the speakers.
- the graphics processing unit renders images containing up to 2048 polygons each, at a frame rate of 60 images per second.
- the graphics processing unit for each display is configured to retrieve video generation information, and to translate the video generation information into display images to be sent to the display coupled to the graphics processing unit, to the video memory coupled to the graphics processing unit, or to both.
- each display may be configured to receive the display images from both a graphics processing unit and a video memory.
- each display may retrieve image information from both the display's graphics processing unit and the display's video memory substantially simultaneously. If two images, each with a maximum rendering capacity of 2048 polygons, are combined and displayed together, the resulting composite image would consequently have a maximum capacity of 4096 polygons, double the original image rendering capacity.
- FIG. 7 is a flow diagram of a process of rendering separate image layers in accordance with aspects of the invention.
- the process determines whether to perform image rendering. If the process determines not to render images, the process proceeds to block 729 , and determines whether to exit the system or reinitiate the image rendering process. If, however, the process proceeds with image rendering, the image rendering process is performed and repeated.
- the process determines whether to render a first image layer A or a second image layer B.
- the first image layer A may be a front layer
- the second image layer B may be a back layer, where the front layer is always layered atop the back layer, and features of the front layer occlude features of the back layer at pixels where objects are rendered for both layers.
- certain features associated with each screenshot will be grouped into either a front layer A or a back layer B. For example, in the embodiments associated with the invention, polygons used to generate images of the lead singer and the lead guitarist are grouped into front layer A, and the remaining polygons, which may be used to generate image details including backup singers or venue, may be grouped into back layer B.
- polygons falling within a predefined portion of the display may be included in layer A, and polygons falling outside the predefined portion, for example the right half of the display image, may be included in layer B.
- the process may include a render index to help determine whether to render layer A or layer B.
- the initial render index value may be arbitrary, or may be preset to either layer A or layer B. If layer A is to be rendered, the process proceeds to block 715 . Similarly, if layer B is to be rendered, the process proceeds to block 721 .
- the process renders image layer A.
- the graphics processing unit retrieves video information associated with features included in image layer A, and processes the video information to generate polygon information for the image layer A features.
- Video information associated with image generation may be initially processed by the processor associated with the handheld video game device, and may include video game information stored in the removable memory and/or user input signals originating from input buttons, an attached peripheral, or a wireless communication interface. The video information may be temporarily stored in, and retrieved by a graphics processing unit from, the main memory of the handheld video game device.
- the process includes rendering a front layer A, including polygons used to generate a simulated lead singer and a simulated lead guitarist with respect to a music based video game.
- different features of video display images may be grouped with and rendered as image layer A.
- the process stores image layer A.
- Storage space may be provided by a video memory connected to the graphics processing unit.
- the video memory may have the capacity to store and sort multiple images simultaneously.
- the video memory is capable of storing at least two images in two separate memory allocations at any given time.
- the process stores the rendered image layer A into one of the memory slots, which may be labeled, for example, video memory slot A.
- the rendering process of block 715 and the storing process of block 717 may be performed in conjunction with each other. In other words, when an object in an image layer is rendered during the rendering process, the newly rendered object may immediately be stored into the associated video memory slot before another object in the image layer is rendered. In other embodiments, the entire image layer may be rendered before storage of the completed image layer into the video memory slot A. If video memory slot A is occupied with a previously rendered image layer A, the previously rendered image layer A is overwritten and replaced by the newly rendered image layer A.
- the process displays the rendered image layer A and a second image layer B stored in a second video memory slot dedicated to holding rendered layer B images.
- the process may combine the two image layers into one composite image before sending the display information to the video display. Alternatively, the process may send the image layers separately, and layer the images on top of one another on the display. If the process is in its first iteration, and no image layer B has yet been rendered, the process may display the rendered image layer A alone, or alternatively, the process may display a blank image layer B.
- the process renders image layer B.
- the rendering process closely mirrors the rendering process for image layer A as described in block 715 .
- the graphics processing unit retrieves video information associated with features included in image layer B, and processes the video information to generate polygon information for the image layer B features.
- the features included in image layer B may be the features in a typical screenshot of the video game which were not rendered in image layer A.
- the image layer B may be a back layer B, which includes polygons associated with background imagery, for example, background singers, the remaining band members, and the venue.
- the process stores image layer B.
- the storing process again closely mirrors the storage process for image layer A as described in block 717 .
- Image layer B may be stored in a separate memory allocation in a video memory slot B, or a similar memory allocation dedicated to the storage of newly rendered layer B images.
- the image layer B storing process in block 723 may be performed in conjunction with the image layer B rendering process in block 721 .
- an entire image layer B may be rendered before storage into the video memory slot B. If video memory slot B already holds a previously rendered image layer B, the previously rendered image layer B is overwritten and replaced by the newly rendered image layer B.
- the process displays on a video display a composite image including the rendered image layer B and the image layer A stored in video memory slot A.
- the graphics processing unit recombines the rendered image layer B and the stored image layer A before sending the composite image to the video display.
- the image layer defined to be the front layer, generally image layer A as described herein, may be layered atop the image layer defined to be the back layer, generally image layer B in the described embodiments, thereby occluding objects in the back layer.
- pixels where both layers have object information will display the pixel information of the front layer, and the object information for the pixel included in the back layer will not be displayed.
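The per-pixel occlusion rule described above can be sketched as a simple combine. This is an illustrative model only, not the device's actual display path; the list-of-lists image representation and the use of None to mark pixels with no rendered object are assumptions of the sketch:

```python
def composite(front, back):
    """Combine two image layers; front-layer pixels occlude back-layer pixels."""
    return [
        [f if f is not None else b for f, b in zip(front_row, back_row)]
        for front_row, back_row in zip(front, back)
    ]

# Tiny 2x2 example: the singer and guitarist occupy two front-layer pixels.
front = [[None, "singer"], ["guitarist", None]]
back = [["venue", "venue"], ["venue", "venue"]]
print(composite(front, back))
# [['venue', 'singer'], ['guitarist', 'venue']]
```

Pixels where the front layer has object information display that information; everywhere else the back layer shows through.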
- the process swaps the render index. If image layer A was rendered and stored in the previous iteration, in other words, if the process performed the tasks associated with blocks 715 , 717 , and 719 , the render index is switched from A to B. Likewise, if image layer B was rendered and stored in the previous iteration, meaning the process performed the tasks associated with blocks 721 , 723 , and 725 , the render index is switched from B to A. Therefore, upon the next iteration of the process, the process will render the image layer which was not rendered in the previous iteration of the process. Swapping the image render index allows for image layer A and image layer B to be alternately rendered, thereby allowing for generation of a completely original screenshot every two iterations, and 30 times every second for embodiments where rendering is performed at a rate of 60 images per second.
- the process determines whether to exit the image rendering process. If the process determines to remain in image rendering, the process cycles back to render either a new image layer A or a new image layer B, depending on the current render index. If the process determines to exit image rendering, the process returns.
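The overall loop of FIG. 7 can be summarized in a short sketch. The function below is a stand-in for the real process: rendering is reduced to producing a labeled placeholder, and the slot names and return format are illustrative assumptions, but the control flow (render one layer, overwrite its slot, display both slots, swap the render index) follows the blocks described above:

```python
def run_frames(num_frames):
    """Sketch of the FIG. 7 loop: alternate rendering of layers A and B,
    storing each into its own video memory slot and displaying both slots."""
    slots = {"A": None, "B": None}   # video memory slots A and B
    render_index = "A"               # initial render index preset to layer A
    displayed = []
    for frame in range(num_frames):
        rendered = f"{render_index}{frame}"     # placeholder for a rendered layer
        slots[render_index] = rendered          # overwrite the matching slot
        displayed.append((slots["A"], slots["B"]))  # composite: A atop B
        render_index = "B" if render_index == "A" else "A"  # swap the index
    return displayed

print(run_frames(4))
# [('A0', None), ('A0', 'B1'), ('A2', 'B1'), ('A2', 'B3')]
```

Each composite pairs a newly rendered layer with the most recently stored copy of the other layer, so a completely new screenshot appears every two iterations.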
- FIG. 8 is an illustration of an embodiment of rendering separate image layers in accordance with aspects of the invention.
- a front layer 811 and a back layer 813 are used to generate a composite image 815 making up a screenshot of the video game.
- the front image layer may include particular features of the screenshot, for example, polygons used to render the lead singer 817 and the lead guitarist 819 in the music based video game.
- the back image layer may include other features of the screenshot, for example, polygons used to generate the remaining aspects of the screenshot not rendered in the front image layer.
- the back layer may include polygons used to render the backup singers, the remaining members of the band, and the stage or venue.
- Separate image layer rendering allows for more definition within each image layer, as the maximum number of polygons may be dedicated to the features of each layer rather than allocated across an entire screenshot of the video game.
- the front image layer may dedicate all 2048 polygons in the image layer to rendering the lead guitarist and the lead singer, rather than allocating only a fraction of the 2048 polygons to the two main characters and using the rest of the polygons to generate the simulated surroundings around the two main characters.
- an image A 821 is rendered by the graphics processing unit of the handheld video game device.
- the newly rendered image A is both stored into a video memory slot A 823 , and used to generate the composite image making up a screenshot of the video game.
- the image A may be used as a front layer in the composite image screenshot, and may include, for example, polygons used to render the lead singer and lead guitarist, as has been previously described.
- the graphics processing unit also retrieves a stored image B stored in a video memory slot B 825 .
- the retrieved image B may be used as a back layer in the composite image screenshot, and may include, for example, polygons used to render features of the screenshot not rendered in image A.
- the newly rendered image A and the retrieved image B are combined into a composite image and sent to the video display as a complete screenshot of the video game.
- a new image B 827 is rendered by the graphics processing unit.
- the newly rendered image B is used in the composite screenshot of the video game as the back layer of the composite screenshot.
- the newly rendered image B is also stored into video memory slot B for later retrieval and use, for example, for composite image generation during the even frames.
- the graphics processing unit also retrieves the image A which was stored in the video memory slot A during the previous even frame. The graphics processing unit combines the retrieved image A and the newly rendered image B into a composite image, and sends the composite image to the video display as a complete screenshot of the video game.
- the alternating rendering process may be completed repeatedly at a rate of 60 frames per second, generating a new image A on even frames and a new image B on odd frames, thereby refreshing the entire screenshot every two frames.
- the effective refresh rate of the video display in a handheld video game device which renders 60 images per second is therefore 30 new screenshots per second.
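The effective refresh rate follows directly from the alternation; a minimal check, assuming one layer rendered per frame as described:

```python
RENDER_RATE = 60            # layer images rendered per second (one per frame)
FRAMES_PER_SCREENSHOT = 2   # layer A on even frames, layer B on odd frames

effective_refresh = RENDER_RATE // FRAMES_PER_SCREENSHOT
print(effective_refresh)    # 30 complete screenshots per second
```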
- FIG. 9 is an illustration demonstrating different available levels of detail of rendered features in accordance with aspects of the invention.
- the first image 911 is an example of an image layer, a front image layer in most embodiments of the invention as described herein, and includes two characters from a music based video game.
- the characters may represent a lead guitarist 913 and a lead singer 915 .
- 2048 polygons are used to render the two characters.
- the lead guitarist may be rendered using a predefined portion of the available polygons, for example, 60% of the available polygons.
- the lead singer may be rendered using the remaining polygons not used to render the lead guitarist.
- the two characters therefore split the available polygons, and each character is rendered using a significantly smaller number of polygons than the 2048 polygon limit for each image layer.
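The polygon split can be made concrete with a small calculation; the 60% share comes from the example above, while the rounding behavior is an assumption of this sketch:

```python
POLY_BUDGET = 2048               # per-layer polygon limit from the text

guitarist_share = 0.60                                # example share given above
guitarist_polys = int(POLY_BUDGET * guitarist_share)  # rounding down is assumed
singer_polys = POLY_BUDGET - guitarist_polys          # singer gets the remainder

print(guitarist_polys, singer_polys)  # 1228 820
```

Each character thus receives well under the 2048-polygon per-layer limit when both share one layer.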
- a second level of detail of an image layer may be available. For example, a zoomed in shot or close-up of one of the characters may be desired. A close-up of, for example, the lead guitarist character may be rendered when a particular task, such as a high score, has been achieved in the context of the video game.
- the second image 917 is an example of an image layer including only one character, a lead guitarist 919 , from the music based video game.
- the second image illustrates a second level of detail at which the front layer may be rendered, and may be used interchangeably with the first image as a front layer in the embodiments of the invention described herein. Because there is no lead singer in the second image, the 2048 polygons in the image layer may be completely dedicated to rendering the lead guitarist, and the lead guitarist at the second level of detail is rendered at a much higher resolution than the lead guitarist at the first level of detail.
- the invention may allow for the capability to generate different levels of detail for either one image layer alone, or for both image layers.
- generation of different levels of detail may only apply to rendering of the front layer.
- in a single-image rendering process, any modification to the level of detail of one rendered feature would impact the rendering of the rest of the image, whether through a different polygon allocation for the other image features or a different set of pixels occluded by the feature at the second level of detail. Consequently, the entire image would have to be re-rendered.
- having different levels of detail for only one image layer allows for the second image layer to remain unmodified.
- the level of detail for the first image layer may be interchangeable and adjustable independent of the image rendering process of the second image layer. Therefore, for example, while the front layer in the embodiment illustrated in FIG. 9 may be switched between two different levels of detail, either front layer may be combined with the same back layer to create a composite screenshot of the video game. Furthermore, the level of detail of the front layer may be changed at any time without having to re-render the back layer to accommodate the front layer's change in level of detail.
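The independence of the two layers can be illustrated with a small sketch; the dictionaries, names, and polygon counts below are hypothetical stand-ins for rendered layers, showing that either front-layer variant combines with the same stored back layer:

```python
# Hypothetical front-layer variants; contents are illustrative only.
front_wide = {"characters": ["guitarist", "singer"], "polys_per_character": 1024}
front_closeup = {"characters": ["guitarist"], "polys_per_character": 2048}
back_layer = "stored venue layer"   # remains untouched in video memory

def screenshot(front, back):
    """Composite a screenshot from a chosen front-layer variant and the back layer."""
    return (front["characters"], back)

# Switching the front layer's level of detail needs no re-render of the back layer.
print(screenshot(front_wide, back_layer))
print(screenshot(front_closeup, back_layer))
```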
- FIG. 10 is a further flow diagram of a process for providing images for a display, for example for a video game.
- the process may be performed, for example, by a hand-held video game device configured to execute a video game, for example a music based video game.
- the process determines whether to render objects associated with a first layer or a second layer. If the process determines to render objects associated with the first layer, the process proceeds to block 1013 . If the process determines to render objects associated with the second layer, the process proceeds to block 1019 . In most embodiments the process alternates between rendering of objects associated with the first layer and rendering of objects associated with the second layer. In various embodiments the process may maintain a flag, a register setting, or an index indicating whether to render objects of the first layer or the second layer, or which layer was last rendered.
- the process renders objects associated with the first layer.
- the process may perform the rendering of objects associated with the first layer by way of use of a graphics processing unit, which may be a separate chip or portion of a chip configured to process graphic information.
- the process stores information of the rendered objects associated with the first layer in a first memory, which may be considered a memory A.
- the process displays information stored in a second memory, which may be denoted as a memory B.
- the process swaps a render index.
- the purpose of swapping the render index is to indicate that the process should thereafter render objects associated with a layer other than the layer just rendered. This may be done by way of an index, though it may also be accomplished in other ways: for example, a flag may be set, a register may be set, or separate code sections may be used.
- in block 1027 the process determines whether the process should exit. If yes, the process thereafter exits. If no, the process returns to block 1011 .
- upon returning to block 1011 , the process again determines whether to render the objects associated with layer 1 or the objects associated with layer 2 . Assuming, for the sake of example, that the process had previously rendered objects associated with layer 1 , the process proceeds to block 1019 . In block 1019 the process renders objects associated with layer 2 . In block 1021 the process layers the rendered information of objects of layer 1 and the rendered information of objects of layer 2 . In some embodiments the layering may be performed by the graphics processing unit. In some embodiments, the rendering may be performed by a 3D render engine and the layering may be performed by a 2D graphics engine. In other embodiments the process may be performed by another processor.
- the layering of information may be performed as part of performance of operations of block 1023 , in which the layered information of objects associated with layer 1 and layer 2 are stored in the second memory, which may be denoted as memory B.
- information associated with the rendered objects of layer 2 may be first stored in memory B, with the information stored in memory A thereafter overwriting information stored in memory B, thereby effectively occluding the layer 2 information, or vice versa.
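The store-then-overwrite layering described for memory B can be sketched with dictionaries standing in for pixel memory; the coordinate keys and update order are illustrative assumptions:

```python
def layer_into_memory_b(layer2_pixels, layer1_pixels):
    """Sketch of the layering step: write the back layer's pixels into memory B
    first, then overwrite with the front layer's stored pixels, so front-layer
    pixels occlude back-layer pixels at shared locations."""
    memory_b = {}
    memory_b.update(layer2_pixels)   # back layer (layer 2) written first
    memory_b.update(layer1_pixels)   # front layer (from memory A) overwrites on top
    return memory_b

layer2 = {(0, 0): "venue", (0, 1): "venue"}
layer1 = {(0, 1): "musician"}        # stored earlier in memory A
print(layer_into_memory_b(layer2, layer1))
# {(0, 0): 'venue', (0, 1): 'musician'}
```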
- FIG. 11 illustrates a flow of displaying images on, for example, a hand-held device, in accordance with the process of FIG. 10 .
- the process renders objects associated with a first layer 1111 and stores the rendered information in a video memory A 1113 .
- the process displays on a display 1117 information in a video memory B 1115 .
- the information may be for a music based video game, with the display showing, for example, a note chart including graphical user instructions, a musician, and a background.
- the musician is an object associated with the first layer, and the remainder of the displayed objects are associated with the second layer.
- in other embodiments, the note chart, alone or in conjunction with the musician, is associated with the first layer, and the remainder of the displayed objects are associated with the second layer.
- the process renders objects associated with a second layer, and layers (which in some embodiments comprises combining) the information stored in video memory A with the rendered information of objects of the second layer.
- the process stores the layered information in video memory B.
- the process also, during odd frames, displays the information stored in video memory B on the display.
- during subsequent frames, the process renders objects associated with alternating layers.
- the process displays information that had previously been stored in video memory B, or displays information rendered during the frame together with information previously stored in video memory A on the display.
- the invention therefore provides an image rendering process for, for example, a handheld video game device.
Description
- The present invention relates generally to handheld devices, and more particularly to image rendering for a handheld video game device.
- Video games provide a source of enjoyment to users by allowing users to engage in simulated scenarios and situations the users may not otherwise be able to experience. Video games receive different types of interactive user inputs, and process the inputs into vibrant interactive visual displays and audio accompaniments for the users to enjoy.
- Handheld video game devices, or other mobile devices providing video game functions, are often preferred over traditional video game consoles due to their convenience and mobility. Because of the relatively small size of most handheld video game devices, handheld video game devices allow for easy transport and playing flexibility in environments which would typically be unsuitable for video game play using traditional video game consoles.
- The tradeoff for the small size and mobility of handheld video game devices is generally manifested in the processing power and video display capabilities of the handheld video game devices. The relatively small housings of handheld video game devices do not allow for the hardware capacity and processing power of traditional video game consoles. In addition, the smaller platforms allow for only limited screen sizes, further reducing the video display capabilities of handheld video game devices. While recent years have seen marked improvements in the video display capabilities of a number of handheld video game device platforms, the capacity of video displays in handheld video game devices generally still falls far short of the video display capabilities of more traditional video game consoles.
- The invention provides for displays of a video game. In one aspect the invention provides a method of providing images for a video game, comprising: associating objects with either a first layer or a second layer; rendering objects associated with the first layer; rendering objects associated with the second layer; displaying the rendered objects associated with the first layer on a display; and displaying the rendered objects associated with the second layer on the display.
- In another aspect the invention provides a method of providing images for a music based video game, comprising: associating a first object with a first display layer, the first object representative of a musician in the music based video game; associating a plurality of background objects with a second display layer, the plurality of background objects representative of a venue in the music based video game; iteratively rendering the first object, storing rendered information of the first object in a first memory, and displaying rendered information of the first object on the display; iteratively rendering the plurality of background objects, storing rendered information of the plurality of background objects in a second memory, and displaying rendered information of the plurality of background objects on the display; with displaying rendered information of the first object on the display utilizing the information stored in the first memory in first alternating time periods and displaying rendered information of the plurality of background objects utilizing the information stored in the second memory in second alternating time periods, the first alternating time periods and the second alternating time periods occurring at different times.
- In another aspect the invention provides a handheld game system, comprising: memory storing scene data, the scene data including first scene data and second scene data; a processor configured to render the first scene data and the second scene data; first video memory, coupled to the processor, configured to store rendered first scene data; second video memory, coupled to the processor, configured to store rendered second scene data; and a display coupled to the first video memory and the second video memory; the processor being further configured to alternately: a) render the first scene data, command display on the display of the rendered first scene data, command display on the display of the rendered second scene data in the second video memory, and command storage of the rendered first scene data in the first video memory, and b) render the second scene data, command display on the display of the rendered second scene data, command display on the display of the rendered first scene data in the first video memory, and command storage of the rendered second scene data in the second video memory.
- In another aspect the invention provides a method of providing images for a video game, comprising: associating different objects with different layers; rendering objects associated with a first layer of the different layers; storing information of rendered objects associated with the first layer in a first memory; rendering objects associated with a second layer of the different layers; combining the information of rendered objects associated with the first layer in the first memory with information of rendered objects associated with the second layer; and displaying the combined information.
- These and other aspects of the invention are more fully comprehended on review of this disclosure.
- FIG. 1 illustrates a handheld video game system in accordance with aspects of the invention;
- FIG. 2 is a block diagram of example video processing circuitry for a display of a handheld video game device in accordance with aspects of the invention;
- FIG. 3 is a flow diagram of a process of image rendering in a handheld video game device in accordance with aspects of the invention;
- FIG. 4 is a timeline of an image layer alternating process in accordance with aspects of the invention;
- FIG. 5 is an illustration showing an image combining process in accordance with aspects of the invention;
- FIG. 6 is a block diagram of a handheld video game device in accordance with an embodiment of the invention;
- FIG. 7 is a flow diagram of a process of using separately rendered image layers in accordance with aspects of the invention;
- FIG. 8 illustrates a flow of using separately rendered image layers in accordance with aspects of the invention;
- FIG. 9 illustrates different levels of detail of rendered features in accordance with aspects of the invention;
- FIG. 10 is a flow diagram of a further process of using separately rendered image layers in accordance with aspects of the invention; and
- FIG. 11 illustrates a further flow of using separately rendered image layers in accordance with aspects of the invention.
FIG. 1 is an example of a handheld video game system in accordance with aspects of the invention. The handheld video game system includes a handheldvideo game device 101, including at least one display, at least one user input device, at least one speaker, and at least one removable memory interface. Each handheld video game device also includes internal circuitry generally associated with video game devices, such as processing circuitry for executing video game instructions and memory used to store video game information. In various embodiments, another handheld device capable of providing video game functions may be used instead of the handheld video game device ofFIG. 1 . - In some embodiments, for example, the embodiment as illustrated in
FIG. 1 , the handheld video game device is a Nintendo DS handheld video game device or a Nintendo DS Lite handheld video game device, both widely available in consumer electronics retail stores. In these embodiments, the handheld video game device incorporates a clamshell design, with ahinge 103 allowing for closure of the handheld video game device, thereby protecting the external components of the handheld video game device while in the closed position. The handheld video game device includes two 3-inch displays, each with a resolution of 256×192 pixels, with afirst display 105 located on thetop portion 107 of the clamshell housing, and asecond display 109 located on thebottom portion 111 of the clamshell housing. In some embodiments, the bottom display may include touchscreen input capabilities. - In the embodiment as illustrated in
FIG. 1 ,speakers 113 are located on each side of the top display, while a digitaldirectional pad input 115 is located to the left of the bottom display, and a plurality ofdigital input buttons 117 is located to the right of the bottom display. In some embodiments, additional user inputs may be available on the handheld video game device. The handheld video game device also includes two removable memory interfaces. The firstremovable memory interface 119 is generally configured to read a removable memory cartridge holding video game instructions. The secondremovable memory interface 121 is configured to interact with either a removable memory cartridge used in conjunction with an older handheld video game device platform, or a peripheral used in conjunction with video game play of certain video games. - The handheld video game system illustrated in
FIG. 1 also includes aperipheral input device 123 with a plurality ofadditional input buttons 125. The peripheral input device may be connected to the handheld video game device via the second removable memory interface. In the embodiment as illustrated inFIG. 1 , the peripheral input device is used in conjunction with music based rhythm video game. - In the embodiment as illustrated in
FIG. 1 , the top display is displaying a screenshot from game play of a music based video game. As is consistent with the Nintendo DS and Nintendo DS Lite handheld video game devices, the graphics processing unit of the handheld video game device inFIG. 1 may be, for example, capable of rendering up to 2048 polygons per frame per image, at a rate of 60 frames per second. However, the screenshot displayed in the top display ofFIG. 1 includes double the allowable number of polygons, with for example a maximum display capacity of 4096 total polygons. In accordance with embodiments of the invention, each display of the handheld video game device is capable of displaying at least two separate image layers at the same time. By alternating the generation of two separate image layers, each with display capacity of 2048 polygons, and combining the two image layers, the resulting composite image may be double the traditional image resolution, although the effective frame rate may be reduced. A reduced frame rate, however, may be acceptable, as the new frame rate may be comparable with television and movie frame rates, which typically have refresh rates of between 24 and 30 frames per second. -
FIG. 2 is an example of a block diagram of the video processing circuitry for a display of a handheld video game device in accordance with aspects of the invention. The block diagram of FIG. 2 may represent, for example, the circuitry associated with the top display of the handheld video game device as illustrated in FIG. 1. Video display information is generated by processing video game instructions and user inputs. In some embodiments, a main processor housed in the handheld video game device may begin the video generation process by, for example, compiling and separating scene data and associated video generation information from other video game instructions, and storing the information into memory. - The scene data and other video generation information is stored in
scene data memory 211. In some embodiments, scene data memory may be included as an allocated portion of a main memory in the handheld video game device. In other embodiments, scene data memory may be a separate memory in the handheld video game device, allocated specifically for storage of scene data and other video generation information. - A
graphics processing unit 213 retrieves scene data and other video generation information stored in the scene data memory. The graphics processing unit processes the information from scene data memory and renders images, for example 2D or 3D images, for display on the video display based on the information. In one embodiment, the graphics processing unit in the handheld video game device performs image rendering at a rate of 60 images per second. The graphics processing unit may alternate image rendering between what may be considered a front layer and what may be considered a back layer. In the context of a music video game, the front layer may be used, exclusively in some embodiments, to render images of a lead character or characters, for example a simulated lead guitarist and a simulated lead singer, while the back layer may be used to render images of a background environment, for example a venue, other band members, and remaining image details. In most embodiments, the front layer may be considered on top of the back layer, thereby blocking display of portions of the back layer. In other words, if a particular pixel in a composite image includes image information for both the front layer and the back layer, the image information for the back layer is occluded. As the lead guitarist and lead singer in the front layer are the main features of the video game footage, by using this arrangement, it is possible to render the lead guitarist and the lead singer in greater detail than the rest of the image. - In the embodiment of
FIG. 2, images rendered by the graphics processing unit are sent directly to a display screen 215 for immediate display purposes, and are simultaneously captured and saved to a bank of video memory 217. The video memory is memory dedicated to storing video processing information, and is generally capable of storing multiple completed video images at any given time. In some embodiments, the graphics processing unit renders images and sends the rendered images simultaneously to the display and video memory. In the embodiment of FIG. 2, a video memory A, or VRAM A 219, may be dedicated to storing front layer images rendered by the graphics processing unit, and a video memory B, or VRAM B 221, may be dedicated to storing back layer images rendered by the graphics processing unit. In this embodiment, subsequently rendered front layers overwrite a previous front layer stored in VRAM A, and subsequently rendered back layers overwrite a previous back layer stored in VRAM B. Therefore, when a front layer is rendered by the graphics processing unit, the front layer is updated and replaced on the display and in VRAM A, while the back layer displayed is the back layer previously stored in VRAM B. In the next frame, the graphics processing unit renders a new back layer, and the back layer is updated and replaced on both the display and in VRAM B, while the front layer displayed remains the front layer stored in VRAM A from the previous frame. - The graphics processing unit may continually alternate rendering front layers and back layers in this manner. Therefore, on even frames, for example, the display of the handheld video game device may display a newly updated front layer, and a back layer reused from the previous frame and retrieved from VRAM B. On odd frames, for example, the display of the handheld video game device may alternatively display a newly updated back layer, and a front layer reused from the previous frame and retrieved from VRAM A.
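The alternating render-and-store scheme of FIG. 2 can be sketched as follows. This is a minimal illustration, not the device's actual firmware; the render functions and buffer variables are hypothetical stand-ins for the graphics processing unit and VRAM A/B:

```python
# Sketch of the FIG. 2 scheme: on even frames a new front layer is
# rendered and overwrites VRAM A; on odd frames a new back layer is
# rendered and overwrites VRAM B. The display always combines the
# newest stored copy of each layer. All names are hypothetical.

def render_front_layer(frame):
    return f"front-{frame}"   # placeholder for a rendered front layer

def render_back_layer(frame):
    return f"back-{frame}"    # placeholder for a rendered back layer

vram_a = None  # most recent front layer
vram_b = None  # most recent back layer
displayed = []

for frame in range(4):
    if frame % 2 == 0:                 # even frame: refresh front layer
        vram_a = render_front_layer(frame)
    else:                              # odd frame: refresh back layer
        vram_b = render_back_layer(frame)
    # each frame combines one freshly rendered layer with one layer
    # reused from the previous frame
    displayed.append((vram_a, vram_b))

print(displayed)
```

Note that on the very first frame no back layer has been rendered yet; thereafter every frame reuses exactly one stored layer while the other is replaced.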
As stated previously, the graphics processing unit of the handheld video game device may only be capable of rendering image layers with 2048 polygons. However, the graphics processing unit may be capable of combining two previously rendered image layers into one composite image including more than 2048 polygons. Likewise, the video memory associated with each graphics processing unit is capable of storing images containing more than 2048 polygons, and the display is capable of displaying images containing more than 2048 polygons. Using an alternating layer rendering approach as described, the handheld video game device is therefore capable of displaying videos with double the original resolution capacity of the handheld video game device.
-
FIG. 3 is a flow diagram of a process of image rendering in a handheld video game device in accordance with aspects of the invention. The process may be performed, for example, using the graphics processing unit of the top display in the embodiment as described in FIG. 1. In block 311, the process associates objects with a layer. In some embodiments, some objects may be associated with a first layer and some objects may be associated with a second layer. In some embodiments, association of an object with a layer may be accomplished by storing information in memory correlating objects with a particular layer. In some embodiments, the information may be stored in a table, for example, in memory separate from the objects. - In
block 311, the process processes scene data. Scene data may include, for example, video game instructions from a removable memory including information for running the particular video game being played. Scene data may also include, for example, user inputs generated through video game play, retrieved from, for example, user input apparatuses built into the handheld video game device, or from, for example, a peripheral device as illustrated inFIG. 1 , or from, for example, input signals from another handheld video game device via a wireless connection interface. Some data may also originate from the main processor of the handheld video game device, which also processes video game instructions and user inputs to generate, for example, video game states associated with the proper functionality of the video game, or from the main memory of the handheld video game device, which stores generated video game states that may include information on image rendering. - In
block 313, the process, usually by way of a graphics processing unit, renders objects within an image. Generally, the process renders objects associated with different layers at different times. For example, with two layers, the process may render objects associated with a first layer during a first time period, render objects associated with a second layer during a second time period, and repeat the rendering of objects in different layers in an alternating manner, thereby effectively rendering a first layer of an image and a second layer of an image in an alternating manner. - In
block 317, the process stores the information associated with each object generated during the image rendering process, or in other words, stores the results of rendering objects. In some embodiments, rendered object information for different layers is stored in different memories. In addition, in some embodiments the process may also command display of the rendered objects. In embodiments where images are rendered polygon by polygon, the process may generate a variety of information pertaining to each polygon. For example, the polygon shape and color are generated in block 313, and the polygon layer identification information and exact display location of the polygon within the layer are generated in block 315. After the rendered object information has been generated and compiled, it is stored in memory until the entire image layer has been successfully rendered. For example, the object information may be stored in the video memory associated with the graphics processing unit, or alternatively, the object information may be temporarily stored in the main memory of the handheld video game device. An image layer may be considered to be successfully rendered when all the objects to be rendered associated with the layer have been compiled, and the graphics processing unit can use the compilation of object information to render a completed image layer. - It should be recognized that in some embodiments the process performs the operations of
block 311, relating to association of objects with layers, prior to or when storing game data on, for example, a game cartridge or other memory storing video game instructions and data, and the process may thereafter repetitively perform the operations of blocks 313, 315, and 317. - The process afterwards returns. The process may be repeated based on the object generation progress of the image layer being rendered and on the image rendering requirements of the graphics processing unit.
-
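The flow of FIG. 3 can be sketched as follows. The object names, the table layout, and the record fields are hypothetical illustrations of the association (block 311), rendering (block 313), and storage (block 317) steps described above:

```python
# Hypothetical sketch of the FIG. 3 flow: a table in memory associates
# objects with layers (block 311); objects of one layer are rendered
# together (block 313) and their compiled per-object information is
# stored until the layer is complete (block 317). Names are illustrative.
LAYER_TABLE = {
    "lead_singer":    "A",  # front layer objects
    "lead_guitarist": "A",
    "backup_singers": "B",  # back layer objects
    "venue":          "B",
}

def render_layer(layer):
    # render every object associated with the given layer and compile
    # its information (shape/color and layer id/location would be
    # generated here, per blocks 313 and 315)
    return [{"object": obj, "layer_id": layer}
            for obj, assigned in LAYER_TABLE.items() if assigned == layer]

stored = {"A": render_layer("A"), "B": render_layer("B")}
print([record["object"] for record in stored["A"]])
```

Keeping the association in a table separate from the objects, as the text suggests, lets the layer assignments be fixed when the game data is authored rather than recomputed at render time.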
FIG. 4 is a timeline of the image layer alternating process in accordance with aspects of the invention. In the embodiment as illustrated in FIG. 4, image A1 411, representing the first front layer, is rendered and sent to the display when the process begins at frame 0. In accordance with FIG. 2, A1 is also stored into VRAM A, the video memory slot allocated for storage of the most recent front layer image. - At the next frame,
frame 1, image B1 413, representing the first back layer, is rendered and displayed on the video display, and stored into VRAM B, the video memory slot allocated for storage of the most recent back layer image. In accordance with embodiments of the invention, the display of the handheld video game device is capable of displaying at least two image layers at the same time. Therefore, during frame 1, image A1 is not replaced in VRAM A, and is instead recycled and combined with the new image B1 into a composite image making up the complete screenshot. - At the next frame,
frame 2, image A2 415, representing the second front layer, is rendered and displayed on the video display, while simultaneously stored into VRAM A. When stored into VRAM A, image A2 overwrites and replaces the previous image A1, so that there is only one front layer image stored in VRAM A at any given time. Image B1, which is still stored in VRAM B, is reused, and image A2 is rendered on top of image B1, completing the screenshot. - At
frame 3, image B2 417, representing the second back layer, is rendered and displayed on the video display, and stored into VRAM B, overwriting the previous image B1. This process is similar to the storage process associated with VRAM A during frame 2. The new image B2 is layered with image A2 to create the composite screenshot at frame 3. As can be seen in FIG. 4, this process may be repeated, so that a new front layer image is rendered and stored on every even frame, and a new back layer image is rendered and stored on every odd frame. In an embodiment where 60 images are rendered per second, each layer, and consequently each complete screenshot, is only re-rendered 30 times per second. However, a refresh rate of 30 images per second is on par with, for example, television and movies, which typically employ a refresh rate of between 24 and 30 images per second. -
FIG. 5 is an illustration showing the image combining process in accordance with aspects of the invention. Layer A is a rendered front layer 511, including a lead singer 513 and a lead guitarist 515. Layer A, as illustrated in FIG. 5, can display a maximum of 2048 polygons. As the layer A illustrated does not include any background imagery, such as, for example, backup singers and venue imagery, the 2048 polygons are used exclusively for rendering of the lead singer and the lead guitarist. In contrast, when the entire screenshot is rendered as one image, the 2048 polygons are used for the whole image, including the lead singer, the lead guitarist, and all of the associated background imagery. Thus, the invention allows for the lead singer and lead guitarist to be rendered in much greater detail than would be possible when the entire screenshot is rendered as one image. - Likewise, layer B is a rendered back
layer 517, including all the background imagery associated with the screenshot. Similar to layer A, layer B can display a maximum of 2048 polygons as well. In this fashion, the background imagery may utilize an increased number of polygons, as polygons that would otherwise be allocated to the lead singer and/or lead guitarist can be used instead to enhance details in the background imagery. - The rendered layers A and B are combined into one
composite screenshot 519 including both layers. With both the front layer and the back layer capable of displaying up to 2048 polygons, the composite layer thereby has a maximum rendering capability of 4096 polygons, equivalent to double the video display resolution of traditional image rendering on similar handheld video game devices. In the embodiment as illustrated in FIG. 5, the portions of layer B where features also exist in layer A are displayed as null 521, creating silhouette shapes of the lead guitarist and lead singer in the illustrated layer B. In some embodiments, pixels from layer B are occluded if the pixels already contain objects in layer A, so that layer B contains null spaces upon rendering similar to the null spaces illustrated in FIG. 5. The composite image is thus layered neatly upon combining of layers A and B, without any overlapping pixel information between the two layers. In other embodiments, layer B may be a fully rendered image layer, with the aforementioned null spaces also filled in with background image details. In these embodiments, front layer A may be laid atop back layer B for every frame, before being sent to the display, thereby covering and occluding all layer B pixels where layer A objects already exist. -
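The per-pixel combining rule described above can be sketched as follows. Pixels are simplified to `None` (null) or an opaque value; this is an illustration of the occlusion behavior only, not the device's actual pixel format:

```python
# Sketch of the combining rule: where the front layer has pixel
# information, the back layer pixel is occluded; where the front layer
# is null, the back layer shows through. Pixel values are hypothetical.

def composite(front, back):
    # front-layer pixels win; back-layer pixels fill the null spaces
    return [f if f is not None else b for f, b in zip(front, back)]

front = [None, "singer", "guitarist", None]   # front layer A
back  = ["venue", "venue", "venue", "venue"]  # back layer B

print(composite(front, back))
```

The same rule covers both embodiments described above: whether layer B is rendered with silhouette-shaped null spaces or fully rendered, laying layer A on top produces the same composite.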
FIG. 6 is an example of a block diagram of the internal circuitry of a handheld video game device in accordance with another embodiment of the invention. The internal circuitry includes a bus 601 coupling together a processor 603, a main memory 605, a graphics processing unit for a first display 607 and a graphics processing unit for a second display 609, an audio driver 611, and a plurality of different interfaces. The interfaces may include a user input/output (I/O) interface 613, a removable memory interface 615, a peripheral interface 617, and a wireless communication interface 619. In the embodiment as illustrated in FIG. 6, the graphics processing unit for the first display is coupled to a first allocation of video memory 621 dedicated to the first display through a dedicated bus 623. Likewise, the graphics processing unit for the second display is also coupled to a second allocation of video memory 625 dedicated to the second display through another dedicated bus 627. In some embodiments, the graphics processing units may be integrated into the processor, so that the processor performs all graphics processing tasks. In other embodiments, there may be multiple processors, with each processor having separate data bus connections. - Handheld video game devices generally integrate displays, speakers, and user inputs directly into the handheld video game device.
FIG. 6 incorporates the integrated components, with the audio driver coupled to the speakers 629 in the handheld video game device, the graphics processing unit and video memory for a display 1 coupled to display 1 631, the graphics processing unit and video memory for a display 2 coupled to display 2 633, and the user I/O reflected as an integrated component rather than as an interface. - The removable memory interface of the handheld video game device is configured to communicate with a removable memory, for example, a video game cartridge providing video game instructions related to the operation of a specific video game. The processor executes the video game instructions from the removable memory by communicating with each component, including the removable memory, via the bus. The main memory receives and stores information from the other components as needed for the video game to run properly. Stored information may include, for example, video game play instructions, input processing instructions, audio and video generation information, and configuration information from the removable memory, as well as user inputs from either the user I/O or the peripheral interface. The processor adjusts the video game's audio and video properties based in large part on a combination of video game processing instructions from the removable memory and the user inputs. A processor may also receive additional video game instructions and inputs from other handheld video game devices via the wireless communication interface, for example, during wireless multiplayer game play.
- The processor of the handheld video game device receives and processes the video game instructions and inputs, and generates audio and video information for the video game based on the instructions and inputs. The audio driver is configured to receive the audio information from the processor, and to translate the audio information into audio signals to be sent to the speakers.
- In an embodiment of a handheld video game device associated with the invention, the graphics processing unit renders images with a maximum resolution of 2048 polygons per image, at a frame rate of 60 images per second. The graphics processing unit for each display is configured to retrieve video generation information, and to translate the video generation information into display images to be sent to the display coupled to the graphics processing unit, to the video memory coupled to the graphics processing unit, or to both. In some embodiments, such as embodiments of the invention and the embodiment as illustrated in
FIG. 6, each display may be configured to receive the display images from both a graphics processing unit and a video memory. In these embodiments, each display may retrieve image information from both the display's graphics processing unit and the display's video memory substantially simultaneously. If two images, each with a maximum rendering capacity of 2048 polygons, are combined and displayed together, the resulting composite image would consequently have a maximum resolution of 4096 polygons, double the original image rendering capacity. -
FIG. 7 is a flow diagram of a process of rendering separate image layers in accordance with aspects of the invention. In block 711, the process determines whether to perform image rendering. If the process determines not to render images, the process proceeds to block 729, and determines whether to exit the system or reinitiate the image rendering process. If, however, the process proceeds with image rendering, the image rendering process is performed and repeated. - In
block 713, the process determines whether to render a first image layer A or a second image layer B. In some embodiments, the first image layer A may be a front layer, and the second image layer B may be a back layer, where the front layer is always layered atop the back layer, and features of the front layer occlude features of the back layer at pixels where objects are rendered for both layers. In some embodiments, certain features associated with each screenshot will be grouped into either a front layer A or a back layer B. For example, in the embodiments associated with the invention, polygons used to generate images of the lead singer and the lead guitarist are grouped into front layer A, and the remaining polygons, which may be used to generate image details including backup singers or venue, may be grouped into back layer B. In other embodiments, polygons falling within a predefined portion of the display, for example, polygons located in the left half of the display image, may be included in layer A, and polygons falling outside the predefined portion of the display, the right half of the display image in this example, may be included in layer B. The process may include a render index to help determine whether to render layer A or layer B. The initial render index value may be arbitrary, or may be preset to either layer A or layer B. If layer A is to be rendered, the process proceeds to block 715. Similarly, if layer B is to be rendered, the process proceeds to block 721. - In
block 715, the process renders image layer A. The graphics processing unit retrieves video information associated with features included in image layer A, and processes the video information to generate polygon information for the image layer A features. Video information associated with image generation may be initially processed by the processor associated with the handheld video game device, and may include video game information stored in the removable memory and/or user input signals originating from input buttons, an attached peripheral, or a wireless communication interface. The video information may be temporarily stored in, and retrieved by a graphics processing unit from, the main memory of the handheld video game device. In embodiments associated with the invention, the process includes rendering a front layer A, including polygons used to generate a simulated lead singer and a simulated lead guitarist with respect to a music based video game. In other embodiments associated with other video games, different features of video display images may be grouped with and rendered as image layer A. - In
block 717, the process stores image layer A. Storage space may be provided by a video memory connected to the graphics processing unit. The video memory may have the capacity to store and sort multiple images simultaneously. In embodiments of the invention, the video memory is capable of storing at least two images in two separate memory allocations at any given time. The process stores the rendered image layer A into one of the memory slots, which may be labeled, for example, video memory slot A. In some embodiments of the invention, the rendering process of block 715 and the storing process of block 717 may be performed in conjunction with each other. In other words, when an object in an image layer is rendered during the rendering process, the newly rendered object may immediately be stored into the associated video memory slot before another object in the image layer is rendered. In other embodiments, the entire image layer may be rendered before storage of the completed image layer into the video memory slot A. If video memory slot A is occupied with a previously rendered image layer A, the previously rendered image layer A is overwritten and replaced by the newly rendered image layer A. - In
block 719, the process displays the rendered image layer A and a second image layer B stored in a second video memory slot dedicated to holding rendered layer B images. The process may combine the two image layers into one composite image before sending the display information to the video display. Alternatively, the process may send the image layers separately, and layer the images on top of one another on the display. If the process is in its first iteration, and no image layer B has yet been rendered, the process may display the rendered image layer A alone, or alternatively, the process may display a blank image layer B. - In
block 721, the process renders image layer B. The rendering process closely mirrors the rendering process for image layer A as described in block 715. The graphics processing unit retrieves video information associated with features included in image layer B, and processes the video information to generate polygon information for the image layer B features. The features included in image layer B may be the features in a typical screenshot of the video game which were not rendered in image layer A. In the context of a music based video game, the image layer B may be a back layer B, which includes polygons associated with background imagery, for example, background singers, the remaining band members, and the venue. - In
block 723, the process stores image layer B. The storing process again closely mirrors the storage process for image layer A as described in block 717. Image layer B may be stored in a separate memory allocation in a video memory slot B, or a similar memory allocation dedicated to the storage of newly rendered layer B images. In some embodiments, the image layer B storing process in block 723 may be performed in conjunction with the image layer B rendering process in block 721. In other embodiments, an entire image layer B may be rendered before storage into the video memory slot B. If video memory slot B already holds a previously rendered image layer B, the previously rendered image layer B is overwritten and replaced by the newly rendered image layer B. - In
block 725, the process displays on a video display a composite image including the rendered image layer B and the image layer A stored in video memory slot A. In most embodiments, the graphics processing unit recombines the rendered image layer B and the stored image layer A before sending the composite image to the video display. The image layer defined to be the front layer, generally image layer A as has been described herein, may be layered atop the image layer defined to be the back layer, generally image layer B in the described embodiments, thereby occluding objects in the back layer. In other words, pixels where both layers have object information will display the pixel information of the front layer, and the object information for the pixel included in the back layer will not be displayed. - In
block 727, the process swaps the render index. If image layer A was rendered and stored in the previous iteration, in other words, if the process performed the tasks associated with blocks 715, 717, and 719, the render index is swapped so that the next iteration performs the tasks associated with blocks 721, 723, and 725, and vice versa. - In
block 729, the process determines whether to exit the image rendering process. If the process determines to remain in image rendering, the process cycles back to render either a new image layer A or a new image layer B, depending on the current render index. If the process determines to exit image rendering, the process returns. -
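The render-index swap of block 727 can be sketched as a simple toggle; the names and the index values below are hypothetical stand-ins for the flag, register setting, or index the description contemplates:

```python
# Sketch of the block 727 render-index swap: the index selects which
# layer is rendered on each iteration and is toggled afterwards, so
# the process alternates between layer A and layer B.
render_index = "A"           # initial value may be preset arbitrarily

def swap(index):
    return "B" if index == "A" else "A"

sequence = []
for _ in range(4):
    sequence.append(render_index)       # render the indexed layer...
    render_index = swap(render_index)   # ...then swap for the next pass

print(sequence)  # layers alternate: A, B, A, B
```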
FIG. 8 is an illustration of an embodiment of rendering separate image layers in accordance with aspects of the invention. For both even frames and odd frames, a front layer 811 and a back layer 813 are used to generate a composite image 815 making up a screenshot of the video game. The front image layer may include particular features of the screenshot, for example, polygons used to render the lead singer 817 and the lead guitarist 819 in the music based video game. The back image layer may include other features of the screenshot, for example, polygons used to generate the remaining aspects of the screenshot not rendered in the front image layer. In the embodiments as have been described herein, the back layer may include polygons used to render the backup singers, the remaining members of the band, and the stage or venue. Separate image layer rendering allows for more definition within each image layer, as the maximum number of polygons may be dedicated to the features of each layer rather than allocated across an entire screenshot of the video game. For example, the front image layer may dedicate all 2048 polygons in the image layer to rendering the lead guitarist and the lead singer, rather than allocating only a fraction of the 2048 polygons to the two main characters and using the rest of the polygons to generate the simulated surroundings around the two main characters. - In the embodiment of
FIG. 8, on even frames, an image A 821 is rendered by the graphics processing unit of the handheld video game device. The newly rendered image A is both stored into a video memory slot A 823, and used to generate the composite image making up a screenshot of the video game. In the embodiment as illustrated in FIG. 8, the image A may be used as a front layer in the composite image screenshot, and may include, for example, polygons used to render the lead singer and lead guitarist, as has been previously described. During the even frames, the graphics processing unit also retrieves a stored image B stored in a video memory slot B 825. In the embodiment as illustrated in FIG. 8, the retrieved image B may be used as a back layer in the composite image screenshot, and may include, for example, polygons used to render features of the screenshot not rendered in image A. The newly rendered image A and the retrieved image B are combined into a composite image and sent to the video display as a complete screenshot of the video game. - During odd frames, a new image B 827 is rendered by the graphics processing unit. In the embodiment as illustrated in
FIG. 8, the newly rendered image B is used in the composite screenshot of the video game as the back layer of the composite screenshot. During the odd frames, the newly rendered image B is also stored into video memory slot B for later retrieval and use, for example, for composite image generation during the even frames. During the odd frames, the graphics processing unit also retrieves the image A which was stored in the video memory slot A during the previous even frame. The graphics processing unit combines the retrieved image A and the newly rendered image B into a composite image, and sends the composite image to the video display as a complete screenshot of the video game. - In the embodiment as illustrated in
FIG. 8, the alternating rendering process may be completed repeatedly at a rate of 60 frames per second, generating a new image A on even frames and a new image B on odd frames, thereby refreshing the entire screenshot every two frames. The effective refresh rate of the video display in a handheld video game device which renders 60 images per second is therefore 30 new screenshots per second. -
FIG. 9 is an illustration demonstrating different available levels of detail of rendered features in accordance with aspects of the invention. The first image 911 is an example of an image layer, a front image layer in most embodiments of the invention as described herein, and includes two characters from a music based video game. The characters may represent a lead guitarist 913 and a lead singer 915. In rendering a front layer such as the first image, 2048 polygons are used to render the two characters. The lead guitarist may be rendered using a predefined portion of the available polygons, for example, 60% of the available polygons. The lead singer may be rendered using the remaining polygons not used to render the lead guitarist. The two characters therefore split the available polygons, and each character is rendered using a significantly smaller number of polygons than the 2048 polygon limit for each image layer. - In some situations, a second level of detail of an image layer may be available. For example, a zoomed in shot or close-up of one of the characters may be desired. A close-up of, for example, the lead guitarist character may be rendered when a particular task, such as a high score, has been achieved in the context of the video game. The
second image 917 is an example of an image layer including only one character, a lead guitarist 919, from the music based video game. The second image illustrates a second level of detail at which the front layer may be rendered, and may be used interchangeably with the first image as a front layer in the embodiments of the invention described herein. Because there is no lead singer in the second image, the 2048 polygons in the image layer may be completely dedicated to rendering the lead guitarist, and the lead guitarist at the second level of detail is rendered at a much higher resolution than the lead guitarist at the first level of detail. - In various aspects, the invention may allow for the capability to generate different levels of detail for either one image layer alone, or for both image layers. In some embodiments, such as the embodiment as illustrated in
FIG. 9, generation of different levels of detail may only apply to rendering of the front layer. Traditionally, when a single 2048-polygon image was rendered, any modification to the level of detail of one rendered feature impacted the rendering process of the rest of the image, whether it be a different polygon allocation for the other image features, or a different set of pixels occluded by the feature at the second level of detail. Consequently, the entire image would have to be re-rendered. In contrast, having different levels of detail for only one image layer allows the second image layer to remain unmodified. The level of detail for the first image layer may be interchangeable and adjustable independently of the image rendering process of the second image layer. Therefore, for example, while the front layer in the embodiment illustrated in FIG. 9 may be switched between two different levels of detail, either front layer may be combined with the same back layer to create a composite screenshot of the video game. Furthermore, the level of detail of the front layer may be changed at any time without having to re-render the back layer to accommodate the front layer's change in level of detail. -
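The polygon-budget and level-of-detail behavior described for FIG. 9 can be sketched in a few lines. This is an illustrative sketch only, not code from the patent; the function name is hypothetical, while the 2048-polygon budget and the 60% lead-guitarist share come directly from the example above.

```python
# Hypothetical sketch of the FIG. 9 example: a front image layer has a
# fixed budget of 2048 polygons. At the normal level of detail the lead
# guitarist receives 60% of the budget and the lead singer the remainder;
# in a close-up, all 2048 polygons go to the lead guitarist.

LAYER_POLYGON_BUDGET = 2048  # per-layer polygon limit from the example

def front_layer_allocation(close_up=False, lead_share=0.60):
    """Return a dict mapping character -> polygon count for the front layer."""
    if close_up:
        # Second level of detail: the whole budget renders one character.
        return {"lead_guitarist": LAYER_POLYGON_BUDGET}
    lead = int(LAYER_POLYGON_BUDGET * lead_share)
    # The singer gets whatever the guitarist does not use.
    return {"lead_guitarist": lead,
            "lead_singer": LAYER_POLYGON_BUDGET - lead}

normal = front_layer_allocation()
close = front_layer_allocation(close_up=True)
assert sum(normal.values()) == LAYER_POLYGON_BUDGET
assert close["lead_guitarist"] == LAYER_POLYGON_BUDGET
```

Because only the front layer's allocation changes between the two calls, switching levels of detail in this sketch involves no change to, and hence no re-render of, the back layer.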
FIG. 10 is a further flow diagram of a process for providing images for a display, for example for a video game. The process may be performed, for example, by a hand-held video game device configured to execute a video game, for example a music based video game. - In
block 1011 the process determines whether to render objects associated with a first layer or a second layer. If the process determines to render objects associated with the first layer, the process proceeds to block 1013. If the process determines to render objects associated with the second layer, the process proceeds to block 1019. In most embodiments the process alternates between rendering of objects associated with the first layer and rendering of objects associated with the second layer. In various embodiments the process may maintain a flag, a register setting, or an index indicating whether to render objects of the first layer or the second layer, or which layer was last rendered. - In
block 1013 the process renders objects associated with the first layer. The process may perform the rendering of objects associated with the first layer by way of use of a graphics processing unit, which may be a separate chip or portion of a chip configured to process graphic information. In block 1015 the process stores information of the rendered objects associated with the first layer in a first memory, which may be considered a memory A. In block 1017 the process displays information stored in a second memory, which may be denoted as a memory B. - In
block 1025 the process swaps a render index. The purpose of swapping the render index is to indicate to the process that it should thereafter render objects associated with a layer other than the layer just rendered. This may be done by way of an index, although it may be accomplished in many other ways: for example, a flag may be set, a register may be set, or separate code sections may be used. - In
block 1027 the process determines whether the process should exit. If so, the process exits. If not, the process returns to block 1011. - Upon returning to block 1011 the process again determines whether to render the objects associated with
layer 1 or render the objects associated with layer 2. Assuming, for the sake of example, that the process had previously rendered objects associated with layer 1, the process proceeds to block 1019. In block 1019 the process renders objects associated with layer 2. In block 1021 the process layers the rendered information of objects of layer 1 and the rendered information of objects of layer 2. In some embodiments the layering may be performed by the graphics processing unit. In some embodiments, the rendering may be performed by a 3D render engine and the layering may be performed by a 2D graphics engine. In other embodiments the process may be performed by another processor. In some embodiments the layering of information may be performed as part of performance of operations of block 1023, in which the layered information of objects associated with layer 1 and layer 2 is stored in the second memory, which may be denoted as memory B. For example, in some embodiments information associated with the rendered objects of layer 2 may be stored first in memory B, with the information stored in memory A thereafter overwriting information stored in memory B, thereby effectively occluding portions of the layer 2 information, or vice versa. - The process then again goes to block 1017 and displays the information stored in memory B, and thereafter continues as previously discussed.
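The overwrite-style layering of blocks 1021 and 1023 can be illustrated as follows. This is a sketch under assumed names, not the patent's implementation: pixels are modeled as list entries, and None stands in for a transparent front-layer pixel.

```python
# Sketch of blocks 1021/1023: the layer 2 render is stored in memory B
# first, and the layer 1 information held in memory A then overwrites it
# wherever layer 1 has content, so layer 1 effectively occludes layer 2.
# Pixel values and the use of None for transparency are assumptions made
# for illustration only.

def layer_into_memory_b(memory_a, layer2_render):
    memory_b = list(layer2_render)        # layer 2 stored in memory B first
    for i, front in enumerate(memory_a):
        if front is not None:             # front (layer 1) pixel present
            memory_b[i] = front           # overwrite, occluding layer 2
    return memory_b

layer2 = ["note_chart", "background", "background", "background"]
memory_a = [None, "musician", "musician", None]   # layer 1 from memory A
composite = layer_into_memory_b(memory_a, layer2)
assert composite == ["note_chart", "musician", "musician", "background"]
```

Reversing the roles, the "or vice versa" in the text, would store memory A's contents first and let the layer 2 render overwrite them instead.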
-
FIG. 11 illustrates a flow of displaying images on, for example, a hand-held device, in accordance with the process of FIG. 10. As may be seen in FIG. 11, during even frames the process renders objects associated with a first layer 1111 and stores the rendered information in a video memory A 1113. Also during even frames, the process displays on a display 1117 information in a video memory B 1115. The information may be for a music based video game, with the display showing, for example, a note chart including graphical user instructions, a musician, and a background. In some embodiments the musician is an object associated with the first layer, and the remainder of the displayed objects are associated with the second layer. In some embodiments, the note chart, alone or in conjunction with the musician, is associated with the first layer, and the remainder of the displayed objects are associated with the second layer. - During odd frames, the process renders objects associated with a second layer, and layers, which in some embodiments comprises combining, the information stored in video memory A with the rendered information of objects of the second layer. The process stores the layered information in video memory B. The process also, during odd frames, displays the information stored in video memory B on the display.
- Thus, in every other frame, the process renders objects associated with different layers. In addition, in alternating frames, the process either displays information that had previously been stored in video memory B, or displays information rendered during the frame, layered with information previously stored in video memory A, on the display.
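Putting FIGS. 10 and 11 together, the alternating even/odd frame flow might be simulated as below. All names here are hypothetical stand-ins; the point is only the sequencing: even frames render layer 1 into memory A while displaying memory B, and odd frames render layer 2, layer it with memory A into memory B, and display the result.

```python
# Hypothetical simulation of the alternating-frame flow of FIGS. 10-11.
# Strings stand in for rendered image data; a tuple stands in for the
# layered composite of the two renders.

def run_frames(n_frames):
    memory_a, memory_b = None, "initial frame"
    displayed = []
    for frame in range(n_frames):
        if frame % 2 == 0:                  # even frame
            memory_a = f"layer1@{frame}"    # render layer 1 -> memory A
            displayed.append(memory_b)      # display memory B
        else:                               # odd frame
            layer2 = f"layer2@{frame}"      # render layer 2
            memory_b = (memory_a, layer2)   # layer with memory A -> memory B
            displayed.append(memory_b)      # display memory B
    return displayed

frames = run_frames(4)
assert frames[1] == ("layer1@0", "layer2@1")
assert frames[2] == frames[1]  # even frame re-displays memory B unchanged
assert frames[3] == ("layer1@2", "layer2@3")
```

One consequence visible in the simulation is that a fully new composite is produced only every other frame; the intervening even frame re-displays the composite already held in video memory B.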
- The invention therefore provides an image rendering process for, for example, a handheld video game device. Although the invention has been described with respect to certain embodiments, it should be recognized that the invention may be practiced other than as specifically described, the invention comprising the claims and their insubstantial variations supported by this disclosure.
Claims (22)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/136,563 US20090305782A1 (en) | 2008-06-10 | 2008-06-10 | Double render processing for handheld video game device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20090305782A1 true US20090305782A1 (en) | 2009-12-10 |
Family
ID=41400810
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/136,563 Abandoned US20090305782A1 (en) | 2008-06-10 | 2008-06-10 | Double render processing for handheld video game device |
Country Status (1)
Country | Link |
---|---|
US (1) | US20090305782A1 (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110310121A1 (en) * | 2008-08-26 | 2011-12-22 | Pure Depth Limited | Multi-layered displays |
US20120094773A1 (en) * | 2010-10-15 | 2012-04-19 | Nintendo Co., Ltd. | Storage medium having stored thereon game program, image processing apparatus, image processing system, and image processing method |
US20120306855A1 (en) * | 2011-06-03 | 2012-12-06 | Nintendo Co., Ltd. | Storage medium having stored therein display control program, display control apparatus, display control method, and display control system |
US20140307043A1 (en) * | 2011-11-10 | 2014-10-16 | Esaturnus | Ultra Low Latency Video Communication |
CN104548595A (en) * | 2013-10-24 | 2015-04-29 | 索尼电脑娱乐公司 | Capture execution/non-execution control device, capture execution/non-execution control method, program, and storage medium |
US20160225350A1 (en) * | 2015-02-03 | 2016-08-04 | Dong-han Lee | Image combination device and display system comprising the same |
US9808724B2 (en) | 2010-09-20 | 2017-11-07 | Activision Publishing, Inc. | Music game software and input device utilizing a video player |
CN104548595B (en) * | 2013-10-24 | 2018-02-09 | 索尼电脑娱乐公司 | capture execution/non-execution control device, control method, program and storage medium |
CN108600668A (en) * | 2018-03-27 | 2018-09-28 | 维沃移动通信有限公司 | A kind of record screen frame per second method of adjustment and mobile terminal |
US20180376097A1 (en) * | 2015-08-21 | 2018-12-27 | Beijing Kingsoft Internet Sercurity Software Co., Ltd. | Image Generation Method and Device |
USD1004566S1 (en) * | 2019-12-24 | 2023-11-14 | Saregama India Limited | Media player |
Citations (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5355762A (en) * | 1990-09-25 | 1994-10-18 | Kabushiki Kaisha Koei | Extemporaneous playing system by pointing device |
US5408911A (en) * | 1991-03-04 | 1995-04-25 | Lyrrus, Inc. | Musical instrument string |
US6008820A (en) * | 1995-08-04 | 1999-12-28 | Microsoft Corporation | Processor for controlling the display of rendered image layers and method for controlling same |
US6028583A (en) * | 1998-01-16 | 2000-02-22 | Adobe Systems, Inc. | Compound layers for composited image manipulation |
US6100906A (en) * | 1998-04-22 | 2000-08-08 | Ati Technologies, Inc. | Method and apparatus for improved double buffering |
US6266068B1 (en) * | 1998-03-13 | 2001-07-24 | Compaq Computer Corporation | Multi-layer image-based rendering for video synthesis |
US6326964B1 (en) * | 1995-08-04 | 2001-12-04 | Microsoft Corporation | Method for sorting 3D object geometry among image chunks for rendering in a layered graphics rendering system |
US6369830B1 (en) * | 1999-05-10 | 2002-04-09 | Apple Computer, Inc. | Rendering translucent layers in a display system |
US6380935B1 (en) * | 1999-03-17 | 2002-04-30 | Nvidia Corporation | Circuit and method for processing render commands in a tile-based graphics system |
US6390923B1 (en) * | 1999-11-01 | 2002-05-21 | Konami Corporation | Music playing game apparatus, performance guiding image display method, and readable storage medium storing performance guiding image forming program |
US6514083B1 (en) * | 1998-01-07 | 2003-02-04 | Electric Planet, Inc. | Method and apparatus for providing interactive karaoke entertainment |
US6856323B2 (en) * | 2001-04-09 | 2005-02-15 | Weather Central, Inc. | Layered image rendering |
US20050255914A1 (en) * | 2004-05-14 | 2005-11-17 | Mchale Mike | In-game interface with performance feedback |
US6971882B1 (en) * | 1998-01-07 | 2005-12-06 | Electric Planet, Inc. | Method and apparatus for providing interactive karaoke entertainment |
US20060009979A1 (en) * | 2004-05-14 | 2006-01-12 | Mchale Mike | Vocal training system and method with flexible performance evaluation criteria |
US20060058101A1 (en) * | 2004-09-16 | 2006-03-16 | Harmonix Music Systems, Inc. | Creating and selling a music-based video game |
US7042467B1 (en) * | 2000-05-16 | 2006-05-09 | Adobe Systems Incorporated | Compositing using multiple backdrops |
US20060100021A1 (en) * | 2004-03-31 | 2006-05-11 | Nintendo Co., Ltd. | Game console and emulator for the game console |
US7184059B1 (en) * | 2000-08-23 | 2007-02-27 | Nintendo Co., Ltd. | Graphics system with copy out conversions between embedded frame buffer and main memory |
US20070060346A1 (en) * | 2005-06-28 | 2007-03-15 | Samsung Electronics Co., Ltd. | Tool for video gaming system and method |
US20070291037A1 (en) * | 2006-06-01 | 2007-12-20 | Blaukopf Jacob B | Apparatus and method for selectively double buffering portions of displayable content |
US20080034292A1 (en) * | 2006-08-04 | 2008-02-07 | Apple Computer, Inc. | Framework for graphics animation and compositing operations |
US20080082549A1 (en) * | 2006-10-02 | 2008-04-03 | Vic Baker | Multi-Dimensional Web-Enabled Data Viewer |
US20080102951A1 (en) * | 2006-11-01 | 2008-05-01 | Nintendo Co., Ltd. | Storage medium storing a game program, game apparatus, and game control method |
US20080113698A1 (en) * | 2006-11-15 | 2008-05-15 | Harmonix Music Systems, Inc. | Method and apparatus for facilitating group musical interaction over a network |
US20080117217A1 (en) * | 2003-11-19 | 2008-05-22 | Reuven Bakalash | Multi-mode parallel graphics rendering system employing real-time automatic scene profiling and mode control |
US20080129747A1 (en) * | 2003-11-19 | 2008-06-05 | Reuven Bakalash | Multi-mode parallel graphics rendering system employing real-time automatic scene profiling and mode control |
US20080145027A1 (en) * | 2007-05-31 | 2008-06-19 | Tomoyuki Okada | Recording medium, playback apparatus, recording method, program, and playback method |
US7528830B2 (en) * | 2003-09-17 | 2009-05-05 | Koninklijke Philips Electronics N.V. | System and method for rendering 3-D images on a 3-D image display screen |
US7535480B2 (en) * | 2005-08-24 | 2009-05-19 | Microsoft Corporation | Compositing rendering layers |
US7542042B1 (en) * | 2004-11-10 | 2009-06-02 | Nvidia Corporation | Subpicture overlay using fragment shader |
US7547260B2 (en) * | 2005-06-28 | 2009-06-16 | Mooney Bert E | Batting cage |
US8003872B2 (en) * | 2006-03-29 | 2011-08-23 | Harmonix Music Systems, Inc. | Facilitating interaction with a music-based video game |
US8201102B2 (en) * | 2007-09-04 | 2012-06-12 | Apple Inc. | Opaque views for graphical user interfaces |
Patent Citations (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5355762A (en) * | 1990-09-25 | 1994-10-18 | Kabushiki Kaisha Koei | Extemporaneous playing system by pointing device |
US5408911A (en) * | 1991-03-04 | 1995-04-25 | Lyrrus, Inc. | Musical instrument string |
US6326964B1 (en) * | 1995-08-04 | 2001-12-04 | Microsoft Corporation | Method for sorting 3D object geometry among image chunks for rendering in a layered graphics rendering system |
US6008820A (en) * | 1995-08-04 | 1999-12-28 | Microsoft Corporation | Processor for controlling the display of rendered image layers and method for controlling same |
US6971882B1 (en) * | 1998-01-07 | 2005-12-06 | Electric Planet, Inc. | Method and apparatus for providing interactive karaoke entertainment |
US6514083B1 (en) * | 1998-01-07 | 2003-02-04 | Electric Planet, Inc. | Method and apparatus for providing interactive karaoke entertainment |
US6028583A (en) * | 1998-01-16 | 2000-02-22 | Adobe Systems, Inc. | Compound layers for composited image manipulation |
US6266068B1 (en) * | 1998-03-13 | 2001-07-24 | Compaq Computer Corporation | Multi-layer image-based rendering for video synthesis |
US6100906A (en) * | 1998-04-22 | 2000-08-08 | Ati Technologies, Inc. | Method and apparatus for improved double buffering |
US6380935B1 (en) * | 1999-03-17 | 2002-04-30 | Nvidia Corporation | Circuit and method for processing render commands in a tile-based graphics system |
US6369830B1 (en) * | 1999-05-10 | 2002-04-09 | Apple Computer, Inc. | Rendering translucent layers in a display system |
US6390923B1 (en) * | 1999-11-01 | 2002-05-21 | Konami Corporation | Music playing game apparatus, performance guiding image display method, and readable storage medium storing performance guiding image forming program |
US7042467B1 (en) * | 2000-05-16 | 2006-05-09 | Adobe Systems Incorporated | Compositing using multiple backdrops |
US7184059B1 (en) * | 2000-08-23 | 2007-02-27 | Nintendo Co., Ltd. | Graphics system with copy out conversions between embedded frame buffer and main memory |
US6856323B2 (en) * | 2001-04-09 | 2005-02-15 | Weather Central, Inc. | Layered image rendering |
US7528830B2 (en) * | 2003-09-17 | 2009-05-05 | Koninklijke Philips Electronics N.V. | System and method for rendering 3-D images on a 3-D image display screen |
US20080129747A1 (en) * | 2003-11-19 | 2008-06-05 | Reuven Bakalash | Multi-mode parallel graphics rendering system employing real-time automatic scene profiling and mode control |
US20080117217A1 (en) * | 2003-11-19 | 2008-05-22 | Reuven Bakalash | Multi-mode parallel graphics rendering system employing real-time automatic scene profiling and mode control |
US20060100021A1 (en) * | 2004-03-31 | 2006-05-11 | Nintendo Co., Ltd. | Game console and emulator for the game console |
US20050255914A1 (en) * | 2004-05-14 | 2005-11-17 | Mchale Mike | In-game interface with performance feedback |
US7806759B2 (en) * | 2004-05-14 | 2010-10-05 | Konami Digital Entertainment, Inc. | In-game interface with performance feedback |
US20060009979A1 (en) * | 2004-05-14 | 2006-01-12 | Mchale Mike | Vocal training system and method with flexible performance evaluation criteria |
US20060058101A1 (en) * | 2004-09-16 | 2006-03-16 | Harmonix Music Systems, Inc. | Creating and selling a music-based video game |
US7542042B1 (en) * | 2004-11-10 | 2009-06-02 | Nvidia Corporation | Subpicture overlay using fragment shader |
US20070060346A1 (en) * | 2005-06-28 | 2007-03-15 | Samsung Electronics Co., Ltd. | Tool for video gaming system and method |
US7547260B2 (en) * | 2005-06-28 | 2009-06-16 | Mooney Bert E | Batting cage |
US7535480B2 (en) * | 2005-08-24 | 2009-05-19 | Microsoft Corporation | Compositing rendering layers |
US8003872B2 (en) * | 2006-03-29 | 2011-08-23 | Harmonix Music Systems, Inc. | Facilitating interaction with a music-based video game |
US20070291037A1 (en) * | 2006-06-01 | 2007-12-20 | Blaukopf Jacob B | Apparatus and method for selectively double buffering portions of displayable content |
US20080034292A1 (en) * | 2006-08-04 | 2008-02-07 | Apple Computer, Inc. | Framework for graphics animation and compositing operations |
US20080082549A1 (en) * | 2006-10-02 | 2008-04-03 | Vic Baker | Multi-Dimensional Web-Enabled Data Viewer |
US20080102951A1 (en) * | 2006-11-01 | 2008-05-01 | Nintendo Co., Ltd. | Storage medium storing a game program, game apparatus, and game control method |
US7775867B2 (en) * | 2006-11-01 | 2010-08-17 | Nintendo Co., Ltd. | Storage medium storing a game program, game apparatus, and game control method |
US20080113698A1 (en) * | 2006-11-15 | 2008-05-15 | Harmonix Music Systems, Inc. | Method and apparatus for facilitating group musical interaction over a network |
US7758427B2 (en) * | 2006-11-15 | 2010-07-20 | Harmonix Music Systems, Inc. | Facilitating group musical interaction over a network |
US20080145027A1 (en) * | 2007-05-31 | 2008-06-19 | Tomoyuki Okada | Recording medium, playback apparatus, recording method, program, and playback method |
US8201102B2 (en) * | 2007-09-04 | 2012-06-12 | Apple Inc. | Opaque views for graphical user interfaces |
Non-Patent Citations (1)
Title |
---|
Badawy et al., A low power and high performance core for planar object overlaying, 1999, IEEE, pages 621-624 * |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8941691B2 (en) * | 2008-08-26 | 2015-01-27 | Pure Depth Limited | Multi-layered displays |
US20110310121A1 (en) * | 2008-08-26 | 2011-12-22 | Pure Depth Limited | Multi-layered displays |
US9808724B2 (en) | 2010-09-20 | 2017-11-07 | Activision Publishing, Inc. | Music game software and input device utilizing a video player |
US10434420B2 (en) | 2010-09-20 | 2019-10-08 | Activision Publishing, Inc. | Music game software and input device utilizing a video player |
US20120094773A1 (en) * | 2010-10-15 | 2012-04-19 | Nintendo Co., Ltd. | Storage medium having stored thereon game program, image processing apparatus, image processing system, and image processing method |
US20120306855A1 (en) * | 2011-06-03 | 2012-12-06 | Nintendo Co., Ltd. | Storage medium having stored therein display control program, display control apparatus, display control method, and display control system |
US20140307043A1 (en) * | 2011-11-10 | 2014-10-16 | Esaturnus | Ultra Low Latency Video Communication |
USRE49077E1 (en) * | 2011-11-10 | 2022-05-17 | Esaturnus | Ultra low latency video communication |
US9264663B2 (en) * | 2011-11-10 | 2016-02-16 | Esaturnus | Ultra low latency video communication |
CN104548595B (en) * | 2013-10-24 | 2018-02-09 | 索尼电脑娱乐公司 | capture execution/non-execution control device, control method, program and storage medium |
US10068615B2 (en) * | 2013-10-24 | 2018-09-04 | Sony Interactive Entertainment Inc. | Capture execution/non-execution control device, capture execution/non-execution control method, program, and information storage medium |
US20150116293A1 (en) * | 2013-10-24 | 2015-04-30 | Sony Computer Entertainment Inc. | Capture execution/non-execution control device, capture execution/non-execution control method, program, and information storage medium |
CN104548595A (en) * | 2013-10-24 | 2015-04-29 | 索尼电脑娱乐公司 | Capture execution/non-execution control device, capture execution/non-execution control method, program, and storage medium |
US20160225350A1 (en) * | 2015-02-03 | 2016-08-04 | Dong-han Lee | Image combination device and display system comprising the same |
US10490168B2 (en) * | 2015-02-03 | 2019-11-26 | Samsung Electronics Co., Ltd. | Image combination device and display system comprising the same |
US11030976B2 (en) | 2015-02-03 | 2021-06-08 | Samsung Electronics Co., Ltd. | Image combination device and display system comprising the same |
US20180376097A1 (en) * | 2015-08-21 | 2018-12-27 | Beijing Kingsoft Internet Sercurity Software Co., Ltd. | Image Generation Method and Device |
US10484639B2 (en) * | 2015-08-21 | 2019-11-19 | Beijing Kingsoft Internet Security Software Co., Ltd. | Image generation method and device |
CN108600668A (en) * | 2018-03-27 | 2018-09-28 | 维沃移动通信有限公司 | A kind of record screen frame per second method of adjustment and mobile terminal |
USD1004566S1 (en) * | 2019-12-24 | 2023-11-14 | Saregama India Limited | Media player |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20090305782A1 (en) | Double render processing for handheld video game device | |
US8159507B2 (en) | Display device, display method, information recording medium and program | |
US8602888B2 (en) | Video game device and image processing program | |
US7317459B2 (en) | Graphics system with copy out conversions between embedded frame buffer and main memory for producing a streaming video image as a texture on a displayed object image | |
JP3725524B2 (en) | Method for generating computer display image and computer processing system and graphics processor for generating image data | |
US6937245B1 (en) | Graphics system with embedded frame buffer having reconfigurable pixel formats | |
US7697015B2 (en) | Storage medium and game device storing image generating program | |
JP2000132706A (en) | Recording medium, image processor and image processing method | |
WO2005109345A1 (en) | Display, displaying method, information recording medium, and program | |
JP3617839B2 (en) | GAME SOUND CONTROL PROGRAM, GAME SOUND CONTROL METHOD, AND GAME DEVICE | |
JP4305903B2 (en) | Image generation system, program, and information storage medium | |
JP2005319029A (en) | Program, information storage medium, and image generating system | |
JP4749198B2 (en) | Program, information storage medium, and image generation system | |
JP3639286B2 (en) | GAME PROGRAM AND GAME DEVICE | |
JP3990258B2 (en) | Image generation system, program, and information storage medium | |
JP4447000B2 (en) | Image generation system, program, and information storage medium | |
JP4651204B2 (en) | Image generation system, program, and information storage medium | |
US7173618B2 (en) | Image creation program and method of creating image | |
JP4637199B2 (en) | Image processing apparatus, image processing method, and program | |
JP3779717B2 (en) | GAME PROGRAM AND GAME DEVICE | |
JP3822882B2 (en) | GAME PROGRAM AND GAME DEVICE | |
ES2640192T3 (en) | Data processing | |
JP4447417B2 (en) | GAME DEVICE, PROGRAM, AND COMPUTER CONTROL METHOD | |
JP4693153B2 (en) | Image generation system, program, and information storage medium | |
JP2002042176A (en) | Game system and information storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: ACTIVISION PUBLISHING, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OBERG, GREGORY KEITH;BOOTH, JESSE NATHANIEL;REEL/FRAME:021325/0903 Effective date: 20080618 |
|
AS | Assignment |
Owner name: BANK OF AMERICA, N.A., WASHINGTON Free format text: SECURITY AGREEMENT;ASSIGNOR:ACTIVISION BLIZZARD, INC.;REEL/FRAME:031435/0138 Effective date: 20131011 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: ACTIVISION ENTERTAINMENT HOLDINGS, INC., CALIFORNIA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:040381/0487 Effective date: 20161014 Owner name: BLIZZARD ENTERTAINMENT, INC., CALIFORNIA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:040381/0487 Effective date: 20161014 Owner name: ACTIVISION ENTERTAINMENT HOLDINGS, INC., CALIFORNI Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:040381/0487 Effective date: 20161014 Owner name: ACTIVISION BLIZZARD INC., CALIFORNIA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:040381/0487 Effective date: 20161014 Owner name: ACTIVISION PUBLISHING, INC., CALIFORNIA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:040381/0487 Effective date: 20161014 |